Societal entropy

From My Strategy

Chapter 1 - Worldview


Previous page: Time - Societal entropy - Next page: Business entropy




Societal entropy


Welcome to the Societal entropy page


Your personal context


In daily life, you experience an ever-growing 'environment': new activities and obligations keep coming in, but few leave. We call this societal entropy, and it has consequences for your social life.


Key take-aways from the deep dive

  • Higher entropy in your environment/context leads to higher brain activity/entropy
  • Life tries to minimise entropy because maintaining it costs energy


Deep dive

Below, you will find a description of different forms of entropy that influence society.

In the brain

Intense complexity and irregular variability in brain activity from one moment to the next mark more significant long-distance correlations in neural activity.

Greater entropy, up to a point, is indicative of more information processing capacity, as opposed to low entropy – characterised by orderliness and repetition – which is seen when we are in deep sleep or coma.

Increased brain entropy indicates increased brain activity, suggesting an increase in information processing capacity in the brain.


In communication

The core idea is that the "informational value" of a communicated message depends on the degree to which the content of the message is surprising.

  • If a highly likely event occurs, the message carries very little information.
  • On the other hand, if a highly unlikely event occurs, the message is much more informative.

For instance, the knowledge that some particular number will not be the winning number of a lottery provides very little information, because any particular chosen number will almost certainly not win. However, knowledge that a particular number will win a lottery has high informational value because it communicates the outcome of a very low probability event.

With low entropy, there is no uncertainty: one has full knowledge of what is to come, so the message contains no "news". With high entropy or maximum uncertainty (e.g. a random symbol sequence), every event is unexpected and therefore new and surprising.
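This relationship between probability and informational value is Shannon's notion of self-information; the sketch below (function names are my own, not from the text) illustrates the lottery example in Python:

```python
import math

def self_information(p: float) -> float:
    """Information content, in bits, of an event with probability p."""
    return -math.log2(p)

def entropy(probabilities: list[float]) -> float:
    """Shannon entropy, in bits: the average self-information of a distribution."""
    return sum(-p * math.log2(p) for p in probabilities if p > 0)

# A highly likely event carries almost no information ...
print(self_information(0.999999))   # close to 0 bits

# ... while a very unlikely event (a 1-in-a-million lottery win) carries a lot.
print(self_information(1e-6))       # about 19.9 bits

# A certain outcome has zero entropy; a uniform (maximally uncertain)
# distribution over four symbols has the maximum, 2 bits.
print(entropy([1.0]))               # 0.0
print(entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```

The uniform case corresponds to the "random symbol sequence" above: every symbol is equally surprising, so the average information per symbol is at its maximum.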


In information theory

The compression entropy of a message (e.g. a computer file) quantifies the information content carried by the message in terms of the best lossless compression rate.
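A rough, hands-on way to see this is to compare how well a lossless compressor shrinks regular versus random data; this sketch uses Python's standard `zlib` as the compressor (the helper name is mine):

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed size divided by original size: a crude upper bound
    on the information content per byte of the message."""
    return len(zlib.compress(data, level=9)) / len(data)

ordered = b"abcd" * 10_000     # highly regular: low entropy, very compressible
random_ = os.urandom(40_000)   # random bytes: high entropy, essentially incompressible

print(compression_ratio(ordered))   # far below 1
print(compression_ratio(random_))   # close to 1 (sometimes slightly above)
```

The regular message compresses to a tiny fraction of its size because its content is predictable, while the random message cannot be compressed: its best lossless compression rate, and hence its compression entropy, is close to the maximum.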


In sociology

Entropy is the natural decay of structure (such as law, organisation, and convention) in a social system. This decay changes your environment, which in turn demands greater entropy in your brain (see above).



Next page: Business entropy

