JUL 31, 2018

Entropy--Defining the Hard-to-Define

WRITTEN BY: Daniel Duan

Without a direct method of measurement, entropy is probably one of the most challenging concepts in physics to grasp. It is central to the second law of thermodynamics, which states that the total entropy, meaning the degree of disorder, of an isolated system never decreases over time.

The first to conceptualize the idea, 19th-century physicist Rudolf Clausius questioned the nature of the inherent loss of usable energy when a working body, an engine by today's definition, performs work, and gave this phenomenon a mathematical expression. Entropy was later given a statistical basis by Ludwig Boltzmann, who came up with a probabilistic way to measure the disorderliness of a collection of ideal gas particles.
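Boltzmann's statistical definition, in the standard modern notation (not spelled out in the article itself), relates the entropy S of a macrostate to the number of microscopic arrangements W consistent with it:

```latex
S = k_B \ln W
```

Here k_B is Boltzmann's constant; the more microstates a configuration admits, the higher its entropy, which is what "disorderliness" means in this statistical picture.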

Mathematician Constantin Carathéodory later gave entropy a purely mathematical formulation in terms of the irreversibility of trajectories and integrability. This helps explain why entropy is the only physical quantity associated with the direction of a process, the so-called arrow of time, and why the second law of thermodynamics carries a built-in temporal requirement.

Entropy can also be described as a measure of energy dispersal. In the context of thermodynamics, entropy is an indicator of how useful a certain amount of energy is. According to many cosmologists, the entropy of our universe is always increasing and its total energy is becoming less useful, leading to an inevitable consequence: the "heat death" of the universe.
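The energy-dispersal view can be made concrete with a textbook example not taken from the article: the free expansion of an ideal gas. When a gas doubles its volume, each particle has twice as many places to be, the microstate count W grows by a factor of 2 per particle, and Boltzmann's formula gives the entropy increase. A minimal sketch (function name and structure are my own):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def entropy_change_free_expansion(n_particles: float, volume_ratio: float) -> float:
    """Entropy change when an ideal gas expands freely from V to V * volume_ratio.

    Each particle's count of accessible positions scales with volume, so the
    total microstate count W grows by volume_ratio ** n_particles, and
    dS = k_B * ln(W_final / W_initial) = n_particles * k_B * ln(volume_ratio).
    """
    return n_particles * K_B * math.log(volume_ratio)

# One mole of gas doubling its volume: dS = N_A * k_B * ln 2, about 5.76 J/K
N_AVOGADRO = 6.02214076e23
print(entropy_change_free_expansion(N_AVOGADRO, 2.0))
```

The expansion is spontaneous and irreversible, so the entropy change is positive, in line with the second law; no energy was lost, but it became more dispersed and less available to do work.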

Source: PBS Space Time via YouTube