The Law of Entropy: The Second Law of Thermodynamics

The law of entropy, or the second law of thermodynamics, holds that in a closed system entropy never decreases: it either rises or, at best, stays constant, and over time it tends to rise. The first law of thermodynamics states that energy can be neither created nor destroyed, only transformed from one form to another. This first law informs scientists’ understanding of the second.
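In the standard textbook shorthand (notation assumed here, not given in the article: S is entropy, U internal energy, Q heat added to the system, W work done by it):

```latex
% Second law: in a closed (isolated) system, entropy never decreases.
\Delta S \geq 0
% First law: energy is conserved; internal energy changes only through
% heat added to the system and work done by it.
\Delta U = Q - W
```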

But what is entropy?

Entropy has become associated in popular vocabulary with chaos, decay and disorder, and it is easy to see how this happened, but as entropylaw.com demonstrates, it is a much more specific scientific term than that. In any transfer of energy from potential to kinetic, some energy is dissipated as waste heat.

Imagine a boulder on top of a hill. The boulder’s position, high above the ground, means that it holds a large store of potential energy, which is transferred into kinetic energy (i.e., movement) when it is pushed and rolls down the hill. But not all of the boulder’s potential energy becomes kinetic energy. With every bump against the ground, friction between the boulder and the hill converts some of that energy into heat, which radiates away.
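A minimal sketch of that bookkeeping, with illustrative numbers (the boulder’s mass, the hill’s height and the share lost to friction are all assumptions, not figures from the article):

```python
g = 9.81          # gravitational acceleration, m/s^2
mass = 500.0      # boulder mass in kg (assumed)
height = 20.0     # height of the hill in m (assumed)

potential_energy = mass * g * height          # energy stored by position, in joules

friction_fraction = 0.15                      # assumed share of energy shed as heat
heat_lost = potential_energy * friction_fraction
kinetic_energy = potential_energy - heat_lost # what remains as motion at the bottom

print(f"Potential energy at the top:  {potential_energy:.0f} J")
print(f"Lost to friction as heat:     {heat_lost:.0f} J")
print(f"Kinetic energy at the bottom: {kinetic_energy:.0f} J")
```

Whatever fraction friction claims, the three figures always balance: no energy vanishes, it just ends up in a less useful form.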

Where does this heat go? In a closed system, the theory runs, the heat, a minute amount but still detectable, dissipates into the atmosphere. The energy is not lost just because the heat dissipates; instead, the temperature of the surrounding air particles is raised ever so slightly. The heated air particles flow outwards, warming further air particles and transferring heat energy until the surroundings are all at the same temperature. Once the temperature differential between the boulder and the air has been evened out, heat flow ceases, but the air is very slightly warmer than it was before.
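A toy numerical sketch of that spreading (assumed values and simple nearest-neighbour averaging, not real gas physics): the temperature differences flatten out, but the total, and therefore the average, stays exactly where it was.

```python
# A row of air "parcels", one warmed a fraction of a degree by the boulder.
# Each step, neighbouring parcels exchange a little heat.
parcels = [20.0] * 10
parcels[0] = 20.5   # the parcel next to the boulder (assumed warming)
rate = 0.1          # assumed exchange rate per step

for _ in range(1000):
    new = parcels[:]
    for i in range(len(parcels)):
        left = parcels[i - 1] if i > 0 else parcels[i]
        right = parcels[i + 1] if i < len(parcels) - 1 else parcels[i]
        new[i] += rate * (left + right - 2 * parcels[i])
    parcels = new

print(f"Final temperatures:  {[round(t, 3) for t in parcels]}")
print(f"Average (unchanged): {sum(parcels) / len(parcels):.3f} °C")
```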

To take another example, a glass of boiling water (at 100°C) placed in a room at 30°C will quickly lose its heat as it transfers to the cooler air of the room. This energy transfer stops as soon as the temperature of the water matches the air temperature of the room (perhaps 30.00001°C).
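The shared final temperature is just the heat-capacity-weighted average of the two starting temperatures. A back-of-envelope check, with assumed sizes (a quarter-litre glass, a modest room’s worth of air; a larger space would pull the answer down towards the article’s tiny fraction above 30°C):

```python
c_water = 4186.0    # specific heat of water, J/(kg*K)
c_air = 1005.0      # specific heat of air, J/(kg*K)

m_water, t_water = 0.25, 100.0   # a glass of water: 0.25 kg at 100 °C (assumed)
m_air, t_air = 60.0, 30.0        # air in a modest room: ~60 kg at 30 °C (assumed)

# Weighted average: the heat lost by the water equals the heat gained by the air.
t_final = (m_water * c_water * t_water + m_air * c_air * t_air) / (
    m_water * c_water + m_air * c_air
)
print(f"Shared equilibrium temperature: {t_final:.2f} °C")  # about 31.2 °C
```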

Entropy is the measure of this dissipation: the term was coined by the physicist Rudolf Clausius as far back as 1865 to quantify the energy in a thermodynamic system that is not available to do useful work.
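Clausius’s definition can be stated compactly (standard notation, not spelled out in the article):

```latex
% A reversible transfer of heat \delta Q at absolute temperature T
% changes a body's entropy by
dS = \frac{\delta Q_{\mathrm{rev}}}{T}
% so when heat Q leaks from a hot body at T_h to a colder one at T_c,
% the total change is positive whenever T_h > T_c:
\Delta S = \frac{Q}{T_c} - \frac{Q}{T_h} > 0
```

The same quantity of heat counts for more entropy at a low temperature than at a high one, which is why every flow of heat from hot to cold pushes the total upwards.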

These two laws of thermodynamics combine to tell scientists two important things. First, perpetual motion (a machine that generates energy continuously without external input) is impossible. Second, our universe, defined as an expanding but essentially closed system, will end in what physicists like to call ‘heat death’.

The first conclusion seems inevitable. If a process loses energy with each iteration, then sooner or later any machine will run down without additional input. Batteries lose their charge, power plants require more fossil fuel, even orbits decay.

The eventual heat death of the Universe is a harder concept to grasp, and so it is often questioned. If the Universe is finite, however, sooner or later the massive amounts of heat radiating from every star and every planet will ‘average out’ the temperature of the whole of creation, until no further heat exchange is possible. Obviously this process will take many more billions of years than the human mind can consciously process.

Enquiring students often raise the question of refrigeration at this point, to query the inevitability of increasing entropy. After all, domestic fridges and freezers can cool the air far beyond the infinitesimal fractions of a degree by which a rolling boulder can warm it, can’t they? But of course, the proportion of electrical energy channelled towards cooling a person’s cheese does not take into account the energy lost as heat from the back of the refrigerator, or as the sound of the motor, or even as the little light that comes on when the door is opened.
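The bookkeeping makes the students’ objection dissolve. A hedged sketch with plausible assumed numbers (temperatures, heat moved and compressor work are all illustrative): the fridge’s interior does lose entropy, but the kitchen gains more, because the heat dumped at the back is the heat removed plus the electrical work put in.

```python
t_inside = 278.0    # fridge interior, ~5 °C in kelvin (assumed)
t_kitchen = 298.0   # kitchen air, ~25 °C in kelvin (assumed)

q_removed = 100.0   # heat pulled away from the cheese, in joules (assumed)
work_input = 40.0   # electrical work done by the compressor (assumed)
q_dumped = q_removed + work_input   # heat expelled at the back, by the first law

ds_inside = -q_removed / t_inside   # the cold interior's entropy falls...
ds_kitchen = q_dumped / t_kitchen   # ...but the kitchen's rises by more
print(f"Entropy change inside:  {ds_inside:+.4f} J/K")
print(f"Entropy change outside: {ds_kitchen:+.4f} J/K")
print(f"Net change:             {ds_inside + ds_kitchen:+.4f} J/K  (positive)")
```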

Entropy increases.