entropy

[My Big TOE Definition]

The term entropy is generally used to describe the degree of disorder, randomness or uncertainty in a system. In My Big TOE, the term is used to describe the quality of consciousness of an individual being or the Larger Consciousness System (LCS) as a whole.

The concept of entropy was first used in the 19th century in physics (in thermodynamics, which is the branch of physics dealing with the relationship between heat, work, temperature and energy). Later, the concept was also introduced in other fields, including information theory and sociology.

In its most general sense, entropy is defined as follows:

  • High entropy = lots of randomness, disorder, uncertainty
  • Low entropy = little randomness, disorder, uncertainty

My Big TOE, as a theory of consciousness, chiefly uses the term as follows:

  • High entropy = low quality of consciousness
  • Low entropy = high quality of consciousness

At first glance, the My Big TOE definition has little to do with the general usage of the term. Thus, to understand why entropy is such a fitting metaphor for My Big TOE, it is helpful to see the implications of how exactly the term is used in other disciplines.

Physics

In physics, the entropy of a system can be looked at from three different but related perspectives: in terms of (1) disorder, (2) unavailability of energy to do work, and (3) uncertainty.

1. Entropy as a measure of disorder

Commonly, the term entropy is used to describe the degree of disorder within a physical system. This is most obvious in relation to the system’s temperature: the hotter something is, the greater the motion energy and thus the disorder of its molecules, and therefore the higher its entropy. A gas has higher entropy than a liquid, and a liquid has higher entropy than a solid. A hotter gas has higher entropy than a cooler gas.

The curious thing about physical systems is that their entropy naturally increases over time. This fact is described by the second law of thermodynamics: left on their own, all physical systems tend toward disorder. Applied to temperature, this means that heat naturally flows from a hotter body to a cooler one, never the other way around. This is why, at room temperature, an ice cube will always melt but water will never spontaneously freeze: heat energy flows from the air in the room to the ice cube (until the ice has melted and reached room temperature), but not from lukewarm water to the air in the room (assuming both are already at the same temperature).

We can, of course, freeze water by putting it into a freezer, but that means we have to put in work: the freezer releases more heat into the kitchen than it extracts from the water, thus leading to an overall increase in entropy. Entropy can only be decreased locally (i.e., inside the freezer) – the total entropy of the larger, closed system (i.e., the freezer plus the kitchen) always increases (at least in physical systems).
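The direction of spontaneous heat flow can be made concrete with the classical formula ΔS = Q/T for a transfer of heat Q at temperature T. A minimal sketch (the temperatures and heat amount below are illustrative, not taken from the text):

```python
# Net entropy change when heat Q flows between two bodies at (nearly)
# fixed temperatures. Classical thermodynamics: dS = Q / T
# (Q in joules, T in kelvin).

def entropy_change(q_joules: float, t_hot: float, t_cold: float) -> float:
    """Net entropy change when q_joules flows from the hot body to the
    cold one: the hot body loses Q/T_hot, the cold body gains Q/T_cold."""
    return q_joules / t_cold - q_joules / t_hot

# Room air at 293 K warming an ice cube at 273 K (illustrative numbers):
ds = entropy_change(1000.0, t_hot=293.0, t_cold=273.0)
print(f"{ds:.3f} J/K")  # positive: total entropy increases

# The reverse flow (cold body to hot body) would make this negative,
# which the second law forbids for a spontaneous process.
```

Because T_cold < T_hot, the cold body always gains more entropy than the hot body loses, so the total can only go up – exactly the one-way street the second law describes.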

2. Entropy as the unavailability of energy to do work

Everybody knows that to do work, you need energy. Most people have also heard of the law of energy conservation: energy can neither be created nor destroyed – it can only be transferred. So when we colloquially speak of “energy consumption”, that is not really correct. Performing work does not involve a consumption of energy but a transfer of energy. This can happen in one of two ways:

  • from one place to another (for example, the motion energy transferred from a baseball player’s bat to the ball)
  • from one form of energy to another (for example, the chemical energy of gasoline/petrol converted into the motion energy of a car).

No energy transfer is 100 percent efficient. Hitting a baseball creates friction between the bat and the ball, releasing heat energy into the air. A combustion engine, too, produces waste heat that is dispersed into the environment. In both cases, the energy converted into heat doesn’t contribute to the intended work – instead, it leads to an increase in entropy. This increase in entropy is irreversible because the amount of energy converted into heat becomes permanently unavailable to do work.

Thus, while during an energy transfer the total amount of energy is always conserved, the total amount of entropy is not. Entropy, therefore, can be defined as a measure of a system’s thermal energy unavailable for doing useful work. In other words, the lower a system’s entropy, the greater its capacity to do work.

Or, put more generally:

  • High entropy = low capacity to do work
  • Low entropy = high capacity to do work
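This link between low entropy and the capacity to do work can be illustrated with the textbook Carnot limit, η = 1 − T_cold/T_hot, the maximum fraction of heat an engine can convert into work (a sketch added for illustration; the reservoir temperatures are arbitrary):

```python
# Carnot efficiency: the maximum fraction of heat convertible into work
# by an engine running between a hot and a cold reservoir.
# eta = 1 - T_cold / T_hot (temperatures in kelvin).

def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    if t_cold > t_hot:
        raise ValueError("t_hot must be the hotter reservoir")
    return 1.0 - t_cold / t_hot

# A large temperature difference (a low-entropy arrangement of energy)
# permits useful work:
print(carnot_efficiency(600.0, 300.0))  # 0.5: up to half the heat can do work

# When the reservoirs reach the same temperature (maximum entropy),
# the capacity to do work vanishes:
print(carnot_efficiency(300.0, 300.0))  # 0.0: no work can be extracted
```

The energy is still all there in both cases – conservation holds – but at uniform temperature none of it is available to do work, which is the sense in which high entropy means low work capacity.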

3. Entropy as a measure of uncertainty

The disorder associated with the concept of entropy can also be seen in terms of uncertainty: the more disordered a system is, the more uncertain we are about the exact state it is in. This is because there are only a few ways in which a system can be orderly but many ways in which it can be disorderly.

A hot gas, as we have seen, has relatively high entropy because its molecules bounce around in all directions. It’s highly uncertain where they will be at any given moment – it’s much easier to pinpoint the position of a molecule in a solid. Thus, we can propose a statistical definition of entropy: higher entropy represents greater randomness and greater uncertainty.

To illustrate the point, take a deck of cards: when unwrapping a brand-new deck for the first time, you will find each suit to be ordered in sequence, from ace to king. This state of “order” is unique – every single card must be in exactly the right place. However, when you shuffle the cards, the desired state of “disorder” can be produced by a much bigger number of different arrangements – a number so great, in fact, that no amount of shuffling will ever restore the deck to perfect order in anyone’s lifetime.
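The size of that number can be checked directly: a 52-card deck has 52! possible orderings, of which exactly one is the factory order (a short illustrative calculation; the shuffling rate is an assumption):

```python
import math

# A 52-card deck has 52! possible orderings; only one is the factory order.
arrangements = math.factorial(52)
print(f"{arrangements:.2e}")  # about 8.07e+67 distinct orderings

# Even shuffling a billion times per second for the age of the universe
# (~4.35e17 seconds) samples a vanishing fraction of them:
shuffles = 4.35e17 * 1e9
print(shuffles / arrangements)  # roughly 5e-42 of all orderings visited
```

With only one "ordered" state among roughly 8 × 10⁶⁷ possible ones, random shuffling is overwhelmingly likely to land on – and stay in – a disordered arrangement, which is the statistical heart of the second law.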

The whole point of shuffling, as any child knows, is to create randomness and thus uncertainty about who holds which cards, and which card will be picked next from the pack. (One could also say that the whole point of randomness and uncertainty is to create a lack of information about which card is where.) Then, once the cards have been dealt and you pick up your hand, the first thing you do is reduce that randomness and uncertainty again: you sort your cards to see which ones go together. Reducing uncertainty increases your ability to do work – it generates the information required for strategizing how best to play your hand and win the game.

Information systems

We have seen how in physics, the concept of entropy links together the notions of disorder, randomness, uncertainty and the ability to do work. It does just the same in the field of information, while also adding the notion of meaning to the mix.

Meaning is what distinguishes information from data. Data may be purely physical, such as words printed on a page. But the meaning of that data – its information value – is non-physical. Interpreting data and understanding its meaning takes a consciousness. Computers can store, print and even read out loud entire books, but they can’t understand their meaning because they aren’t conscious.

My Big TOE is a theory of consciousness: it describes consciousness as an evolving, self-modifying digital information system – the Larger Consciousness System (LCS).

An information system stores and processes information. The more information it stores and processes – and the more useful, valuable and meaningful that information is – the more powerful the system becomes: it evolves. If the system loses information, on the other hand, it de-evolves.

Information, order and entropy

At the most basic level, information is order. If you have random bits of data, you can’t discern any information from that data – there’s no information in randomness. But when the bits become ordered and that order has meaning, then that data carries information.

Take a simple pattern. If we see the beginning of a binary sequence such as “up-down-up-down-up” we can predict with a fair degree of certainty that the next element in that sequence is going to be “down”. The pattern carries information that enables us to make an accurate prediction. If “up” and “down” refer to the work of a machine, for example, then being able to predict what the machine will do next can be very useful.
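The predictability of such a pattern can be quantified with Shannon entropy, H = −Σ p·log₂(p), measured in bits (a sketch for illustration; the up/down machine and the choice of two-symbol blocks are assumptions, not part of the original text):

```python
import math
from collections import Counter

def shannon_entropy(symbols) -> float:
    """Shannon entropy in bits of the empirical distribution of symbols:
    H = -sum(p * log2(p))."""
    counts = Counter(symbols)
    n = len(symbols)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

# A perfectly alternating "up-down" sequence, read in two-symbol blocks,
# is fully predictable: every block is the same.
pattern = ["up", "down"] * 8
blocks = [tuple(pattern[i:i + 2]) for i in range(0, len(pattern), 2)]
print(shannon_entropy(blocks))  # 0.0 bits: no uncertainty about the next block

# A source that emits "up" and "down" equally often and unpredictably
# is maximally uncertain:
print(shannon_entropy(["up", "down"]))  # 1.0 bit for a uniform 2-symbol source
```

Zero bits of entropy means perfect predictability – the pattern itself carries all the information needed to know what comes next – while one bit per symbol means each outcome is a fresh coin flip about which we know nothing in advance.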

Since information represents order, structure and meaning, information is inversely related to entropy:

  • High entropy = little information value, content, meaning
  • Low entropy = lots of information value, content, meaning

This means that information systems evolve by lowering their entropy. A self-aware, self-modifying information system like the LCS will naturally choose to lower its entropy because that means creating more information, value and meaning as well as more awareness and greater capacity to do things.

Entropy, diversity and complexity

Creating information doesn’t merely mean organizing bits into patterns, though. Patterns are highly ordered but they have limited information value – there’s more to order and structure than regularity and repetition. The black and white squares on a chessboard, for example, are ordered as rigidly as they could possibly be – in a certain way, they represent low entropy. Yet they don’t contain much information: all they do is make it easier to see where the pieces can move.

The chess game itself makes this even clearer. The starting position of the 32 pieces is more orderly than almost any other position during the game, yet it has great limitations: only the knights and pawns can move, and no piece can yet attack the other side. As the game develops, both players’ positions appear less orderly, especially to someone who doesn’t know the rules. But at the same time the positions tend to be more powerful: the pieces have more options to attack while also better covering each other. Thus, the strength of a player’s position depends not on the apparent order in terms of regular patterns but on the relative position of all the pieces in accordance with the rules.

This is a crucial point: in the field of information, the notion of entropy is relative to the context. To use another example: if you know the rules of the English language, then you’ll find a lot of information on this web page. Someone who doesn’t know English, in contrast, will glean little to no information from this page – the way the letters are arranged into words and the words into sentences means nothing to them. This is how language works: we can only convey information if we use words with an agreed-upon meaning according to the agreed-upon rules of grammar.

Social systems

The LCS is not only an information system, but also a social system – it consists of countless Individuated Units of Consciousness (IUOCs) interacting with each other. How do social systems function?

  • Social systems function optimally if their participants cooperate – if they help and support each other and share the available resources to meet everybody’s needs.
  • Social systems don’t function well if their participants compete for resources – if they try to impede each other’s development and to take most resources for themselves.

We can thus apply the concept of entropy to social systems by referring to the previously mentioned aspects of order, structure, the capacity to do useful work, certainty, meaning, and commonly accepted rules:

  • A well-functioning social system represents low entropy. It is productive (people care about each other and cooperate for the greater good) and preserves its gains and resources (nobody wants to take things away from others or take more for themselves than they need). It creates synergies: the total is greater than the sum of its parts. There’s little uncertainty about other people’s good intentions.
  • A dysfunctional social system represents high entropy: it is fragile and only partially productive (because of regular inhibition and destruction caused by recurrent conflict and the failure to share knowledge and resources). There’s a lot of uncertainty and mistrust about what people may be up to – even if they seem nice, they could only be pretending and have ulterior motives.

Look at the state of the world today and you’ll have no doubt about which of the above best describes the human predicament. On average, human beings are highly fear- and ego-driven, trying to gain and keep as much as possible for themselves and their family, friends, community and country. Few people would disagree that the planet is in a state of disorder and uncertainty and its ability to do useful work is much lower than it could be.

What is it, then, that makes a social system functional or dysfunctional? Most people would tend to say that it’s the rules of governance: for instance, whether the political system is authoritarian or a healthy democracy.

My Big TOE suggests that such considerations are secondary. What’s of primary importance for the functioning of a social system is the average quality of its participants: whether they are egotistical and self-centered or compassionate and caring – or somewhere in between.

This is a key reason why My Big TOE links the concept of entropy not only to the system level but also to each individual person.

Usage of the term “entropy” in My Big TOE

My Big TOE describes consciousness as a self-modifying information system. From this it follows that consciousness has entropy. This applies to each individual consciousness (IUOC) as well as to the Larger Consciousness System (LCS) as a whole.

As mentioned before, My Big TOE describes the entropy of consciousness as follows:

  • “Lowering entropy” is a synonym for spiritual evolution/spiritual growth.

Actions that lower entropy are expressions of an intent to build and construct: they are achieved through caring and cooperation – when IUOCs are of service to each other. This is the way of love. The opposite happens when the interaction is driven by fear.

My Big TOE defines love as a positive intent focused on other, and fear as a negative expectation or belief focused on self. Love is defined in My Big TOE as the nature of a low-entropy being and fear is defined as the nature of a high-entropy being.

Overall system entropy, then, is a result of each individual’s personal motivation, expressed through intent when making choices:

  • Choices motivated by love usually lead to cooperation and sharing (low entropy): “How can I help?” “How can I be of service?”
  • Choices motivated by fear usually lead to non-cooperation and competition for resources (high entropy): “What can I get?” “What’s in it for me?”

Our intent is an expression of our individual entropy (quality of consciousness). As we stated above, My Big TOE relates entropy and quality of consciousness as follows:

  • low entropy = high quality of consciousness
  • high entropy = low quality of consciousness

This, then, is how entropy is related to love and fear:

  • Love (concern about other) is the natural expression of a low-entropy consciousness.
  • Fear (concern about self) is the natural expression of a high-entropy consciousness.

Spiritual growth, then, is defined by My Big TOE as evolving one’s quality of consciousness from fear to love. Finally, we can equate all the terms and definitions we have used so far in the following way:

lowering one’s entropy = spiritual growth/evolution = growing/evolving one’s quality of consciousness = getting rid of fear = becoming love

The entropy metaphor’s explanatory power

The idea of consciousness being an information system seeking to lower its entropy is central to My Big TOE – so central, in fact, that it is key to the way the model answers life’s most fundamental questions:

  • Why do we exist? The LCS split up into countless IUOCs as part of its strategy to lower its entropy. Each IUOC is an interconnected part of the LCS. If we lower our individual entropy, the collective entropy gets lowered just a little bit – and this is one of the major ways the system can evolve. We are part of the system’s evolutionary strategy.
  • Why does the universe exist? PMR (physical matter reality) is an entropy-reduction trainer for IUOCs. It gives us an environment in which we can make meaningful choices and then learn and grow from those choices.
  • What is our life’s purpose? Our purpose is to lower our individual entropy and become love. Love is caring, cooperating, helping, connecting, constructing; love is the natural expression of a low-entropy consciousness; becoming love is the goal and purpose of our existence.

Additionally, the My Big TOE entropy metaphor works on two more levels.

If you define entropy as a measure of randomness and disorder, then being a high-entropy individual can be compared to having a scattered mind. On average, high-entropy beings tend to have greater difficulty concentrating, and less focus and awareness. They may often feel overwhelmed by uncontrollable thoughts. This can be linked back to the dichotomy of fear and love: a fickle, unstable mind is driven by worries about the future and regrets about the past. Both are caused by fear. Worry, for example, may be caused by the fear of the unknown; regrets may be triggered by the fear of not being good enough. The less fearful you are, the less need for worry and regret you have and the more peaceful your mind is.

One’s state of mind, finally, is closely related to one’s capacity to do work. Beings of high quality/low entropy have a greater ability to focus on what is important. They are less perturbed and stay calm even in the midst of great turmoil. Their choices are less influenced by ego and beliefs, and they care more about others than about themselves. As a result, they have a greater decision space, a natural tendency to make good choices within that decision space, and therefore a greater capacity to be productive and helpful.
