#system #entropy

idea

In information theory, entropy is a measure of unpredictability: signal vs. noise. Pure noise is maximally unpredictable and has maximum entropy (1 bit per binary symbol). A pure, fully predictable signal has an entropy of 0.
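A minimal sketch of this in Python (the helper function and the example distributions are illustrative, not from any cited source):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = sum over p of p * log2(1/p)."""
    return sum(p * math.log2(1 / p) for p in probabilities if p > 0)

# Pure noise: both symbols equally likely -> maximum entropy.
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit per symbol

# Pure signal: the next symbol is fully predictable -> zero entropy.
print(shannon_entropy([1.0, 0.0]))  # 0.0 bits
```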

The world is in a permanent state of decay. Things perpetually move into a more disordered, deconstructed state, toward equilibrium. This is the second law of thermodynamics[1]: the entropy of an isolated system naturally increases over time. Heat evens out, sound dissipates, sugar melts, structures break down, socks disappear. The universe moves into a more stable, disorganized, noisy state: an equilibrium of maximum entropy.
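Stated compactly, using Boltzmann’s definition of entropy (standard statistical mechanics, added here for reference, not from [1]):

```latex
S = k_B \ln \Omega, \qquad \Delta S_{\text{isolated}} \geq 0
```

Disordered macrostates correspond to vastly more microstates Ω, so an isolated system overwhelmingly drifts toward them.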

Left to their own devices, things tend to drift toward the middle, toward equilibrium. Anything that can get worse will get worse. It’s not cynicism, it’s physics. These are just manifestations of entropy.

To move anything away from equilibrium, energy must be brought into the system. Fighting this natural decay therefore requires a continuous supply of external energy.

Variability can be exploited to decrease entropy, by generating information in exchange for flexibility[2].

links

Weight management is a struggle because of entropy.

Continuous improvement (#7) is the only scalable way to fight entropy.

Knowledge management requires organization to maximize signal and minimize noise.

[[Seasonal themes]] are a way to leverage variability to accomplish goals, by framing broad goals as themes rather than fixed targets.

Getting Things Done is a way to systematically reduce entropy at a personal level.

Natural Selection is a process that uses entropy to adapt organisms to a system.

references

[1]: Veritasium’s video on entropy offers a good explanation of the thermodynamics concept.

[2]: The Principles of Product Development Flow, Location 1498:

Fifty percent failure rate is usually optimum for generating information. [...] An event with low probability has high information content.

~ The Principles of Product Development Flow: Second Generation Lean Product Development, Donald G. Reinertsen

In other words, trying something that you don’t know will work is the best way to learn something new.
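A short derivation of why fifty percent is optimal, using the binary entropy function (standard information theory, not taken from the book):

```latex
H(p) = -p \log_2 p - (1 - p) \log_2 (1 - p),
\qquad
\frac{dH}{dp} = \log_2 \frac{1 - p}{p} = 0 \;\Rightarrow\; p = \frac{1}{2}
```

Expected information per trial peaks at 1 bit when the outcome is a coin flip; a trial whose outcome is nearly certain yields almost none, which is why a low-probability event has high information content.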
