Richard Melson

July 2006

Entropy Intro

Entropy Overview

"Ice melting" - a classic example of entropy increasing

http://en.wikipedia.org/wiki/Entropy

In chemistry, physics and thermodynamics, thermodynamic entropy, symbolized by S, is defined by the differential expression dS = dQ/T, where dQ is the amount of heat absorbed reversibly by a thermodynamic system at absolute temperature T. The German physicist Rudolf Clausius introduced the mathematical concept of entropy in the early 1850s to account for the dissipation of energy in thermodynamic systems that produce work. He coined the term from the Greek τροπή (tropē), meaning "transformation". Although the concept of entropy is primarily a thermodynamic construct, it has given rise to ideas in many disparate fields of study, including statistical mechanics, thermal physics, information theory, psychodynamics, economics, and evolution.

Ice melting example

The illustration for this article is a classic example in which entropy increases in a small 'universe': a thermodynamic system consisting of the 'surroundings' (the warm room) and the 'system' (glass, ice, cold water). In this universe, some heat energy dQ from the warmer room surroundings (at 77 °F (298 K)) spreads out to the cooler system of ice and water at its constant temperature T of 32 °F (273 K), the melting temperature of ice. Thus the entropy of the system increases by dQ/T = dQ/273 K. (The heat dQ for this process is the energy required to change water from the solid state to the liquid state, and is called the enthalpy of fusion, i.e. the ΔH of fusion for ice.)

It is important to realize that the entropy of the surrounding room decreases by less than the entropy of the ice and water increases: the room temperature of 298 K is higher than 273 K, and therefore the entropy change dQ/298 K for the surroundings is smaller in magnitude than the entropy change dQ/273 K for the ice and water system. This is always true in spontaneous events in a thermodynamic system, and it shows the predictive importance of entropy: the final net entropy after such an event is always greater than the initial entropy.

As the temperature of the cool water rises to that of the room and the room cools imperceptibly, the sum of dQ/T over the continuous range of temperatures, from the initially cool to the finally warm water, can be found by calculus as the integral of dQ/T. The entire miniature 'universe', i.e. this thermodynamic system, has increased in entropy. Energy has spontaneously become more dispersed and spread out in that 'universe' than when the glass of ice and water was introduced and became a 'system' within it.
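To make the numbers concrete, here is a sketch of the calculation for one mole of melting ice, using the standard molar enthalpy of fusion of water (about 6010 J/mol, a textbook value not given in the text above):

    \Delta S_{\text{ice+water}} = \frac{\Delta H_{\text{fus}}}{T} \approx \frac{6010\ \text{J}}{273\ \text{K}} \approx +22.0\ \text{J/K}

    \Delta S_{\text{room}} = -\frac{\Delta H_{\text{fus}}}{T_{\text{room}}} \approx -\frac{6010\ \text{J}}{298\ \text{K}} \approx -20.2\ \text{J/K}

    \Delta S_{\text{net}} \approx 22.0 - 20.2 = +1.8\ \text{J/K}

The net entropy of the little 'universe' increases, exactly as the discussion above predicts.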

Overview

In a thermodynamic system, a 'universe' consisting of 'surroundings' and 'system' and made up of quantities of matter, pressure differences, density differences, and temperature differences all tend to equalize over time. As shown in the preceding discussion of the illustration involving a warm room (surroundings) and a cold glass of ice and water (system), the difference in temperature begins to be equalized as portions of the heat energy from the warm surroundings spread out to the cooler system of ice and water. Over time the temperature of the glass and its contents becomes equal to that of the room. The entropy of the room has decreased because some of its energy has been dispersed to the ice and water. However, as calculated in the discussion above, the entropy of the system of ice and water has increased more than the entropy of the surrounding room has decreased. This is always true: the dispersal of energy from warmer to cooler always results in an increase in entropy. Thus, when the 'universe' of the room surroundings and the ice and water system has reached an equilibrium of equal temperature, the entropy change from the initial state is at a maximum. The entropy of the thermodynamic system is a measure of how far the equalization has progressed.

Entropy is often described as "a measure of the disorder of a thermodynamic system" or "how mixed-up the system is". Such statements should immediately arouse suspicion, because the terms "disorder" and "mixed-up-ness" are not well defined. The "disorder" of the system as a whole can be formally defined (as discussed below) in a way that is consistent with the realities of entropy, but using the word casually, without such a definition, will almost always lead to confusion. It is only if the word is used in this special sense that a system that is more "disordered" or more "mixed up" on a molecular scale will necessarily also be "a system with a lower amount of energy available to do work" or "a system in a macroscopically more probable state".

The entropy of a thermodynamic system can be interpreted in two distinct, but compatible, ways: macroscopically, as a state variable defined in terms of heat and temperature (the thermodynamic definition below), and microscopically, as a statistical measure of the number of molecular configurations compatible with the observed macroscopic state (the statistical interpretation below).

An important law of physics, the second law of thermodynamics, states that the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value. Unlike almost all other laws of physics, this associates thermodynamics with a definite arrow of time. However, for a universe of infinite size, which cannot be regarded as an isolated system, the second law does not apply.

History

The short history of entropy begins with the work of the mathematician Lazare Carnot, who in his 1803 work Fundamental Principles of Equilibrium and Movement postulated that in any machine the accelerations and shocks of the moving parts all represent losses of moment of activity. In other words, in any natural process there exists an inherent tendency towards the dissipation of useful energy. Building on this work, in 1824 Lazare's son Sadi Carnot published Reflections on the Motive Power of Fire, in which he set forth the view that in all heat-engines "caloric", or what is now known as heat, moves from hot to cold and that "some caloric is always lost". This lost caloric was a precursor of entropy as we now know it. Though formulated in terms of caloric rather than entropy, this was an early insight into the second law of thermodynamics. In the 1850s, Rudolf Clausius began to give this "lost caloric" a mathematical interpretation by questioning the nature of the inherent loss of heat when work is done, e.g. heat produced by friction.[1]

In 1865, Clausius gave this heat loss a name:[2]

I propose to name the quantity S the entropy of the system, after the Greek word τροπή, the transformation. I have deliberately chosen the word entropy to be as similar as possible to the word energy: the two quantities to be named by these words are so closely related in physical significance that a certain similarity in their names appears to be appropriate.

Later, scientists such as Ludwig Boltzmann, Willard Gibbs, and James Clerk Maxwell gave entropy a statistical basis. Carathéodory linked entropy with a mathematical definition of irreversibility, in terms of trajectories and integrability.

Thermodynamic definition

In the early 1850s, Rudolf Clausius began to put the concept of "energy turned to waste" on a differential footing. Essentially, he set forth the concept of the thermodynamic system and advanced the argument that in any irreversible process a small amount of heat energy dQ is incrementally dissipated across the system boundary.

Specifically, in 1850 Clausius published his first memoir, in which he presented a verbal argument as to why Carnot's theorem, proposing the equivalence of heat and work, i.e. Q = W, was not perfectly correct and as such would need amendment. In 1854, Clausius states: "In my memoir 'On the Moving Force of Heat, &c.', I have shown that the theorem of the equivalence of heat and work, and Carnot's theorem, are not mutually exclusive, but that, by a small modification of the latter, which does not affect its principle, they can be brought into accordance." This small modification of the latter is what developed into the second law of thermodynamics.

In his 1854 memoir, Clausius first develops the concepts of interior work, i.e. "those which the atoms of the body exert upon each other", and exterior work, i.e. "those which arise from foreign influences to which the body may be exposed", which may act on a working body of fluid or gas, typically functioning to work a piston. He then discusses the three types of heat by which Q may be divided:

  1. heat employed in increasing the heat actually existing in the body

  2. heat employed in producing the interior work

  3. heat employed in producing the exterior work

Building on this logic, and following a mathematical presentation of the first fundamental theorem, Clausius then presents the first-ever mathematical formulation of entropy, although at this point in the development of his theories he called it "equivalence-value". He states, "the second fundamental theorem in the mechanical theory of heat may thus be enunciated:"[3]

If two transformations which, without necessitating any other permanent change, can mutually replace one another, be called equivalent, then the generation of the quantity of heat Q of the temperature t from work has the equivalence-value:

    Q/T

and the passage of the quantity of heat Q from the temperature t1 to the temperature t2 has the equivalence-value:

    Q (1/T2 − 1/T1)

wherein T is a function of the temperature, independent of the nature of the process by which the transformation is effected.

This is the first-ever mathematical formulation of entropy; at this point, however, Clausius had not yet affixed the concept with the label entropy as we currently know it; this would come in the following two years.

In 1876, Willard Gibbs, building on the work of Clausius and Hermann von Helmholtz, advanced the view that the measurement of "available energy" ΔG in a thermodynamic system could be mathematically accounted for by subtracting the "energy loss" TΔS from the total energy change ΔH of the system. These concepts were further developed by James Clerk Maxwell [1871] and Max Planck [1903].
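In modern notation (a sketch; ΔG, ΔH and ΔS are today's standard symbols for Gibbs's "available energy", the total energy change, and the entropy change), this relation is written:

    \Delta G = \Delta H - T\,\Delta S

A process at constant temperature and pressure can proceed spontaneously only if ΔG is negative.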

Units and symbols:

Conjugate variables of thermodynamics:

  1. pressure and volume

  2. temperature and entropy

  3. chemical potential and particle number

Entropy is a key physical variable in describing a thermodynamic system. The SI unit of entropy is the joule per kelvin (J/K), which is the same as the unit of heat capacity, and entropy is said to be thermodynamically conjugate to temperature. The entropy depends only on the current state of the system, not on its detailed previous history, and so it is a state function of parameters such as pressure and temperature, which describe the observable macroscopic properties of the system. Entropy is usually symbolized by the letter S.

There is an important connection between entropy and the amount of internal energy in the system which is not available to perform work. In any process where the system gives up an energy ΔE and its entropy falls by ΔS, at least TR·ΔS of that energy must be given up to the system's surroundings as unusable heat; otherwise the process will not go forward. (TR is the temperature of the system's external surroundings, which may not be the same as the system's current temperature T.)
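As a sketch in symbols (with Q_waste a label introduced here for the heat passed to the surroundings), requiring that the total entropy not decrease gives:

    Q_{\text{waste}} \ge T_R\,\Delta S, \qquad W_{\text{useful}} \le \Delta E - T_R\,\Delta S

The second inequality bounds the work that can still be extracted once the obligatory waste heat has been paid.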

Statistical interpretation

In 1877, the thermodynamicist Ludwig Boltzmann developed a probabilistic way to measure the entropy of an ensemble of ideal gas particles, in which he defined entropy to be proportional to the logarithm of the number of microstates such a gas could occupy.
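In its modern form (first written this way by Planck, and engraved on Boltzmann's tombstone), this definition reads:

    S = k_B \ln \Omega

where Ω is the number of microstates consistent with the given macrostate and k_B ≈ 1.38 × 10⁻²³ J/K is Boltzmann's constant.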

Since then, the essential problem in statistical thermodynamics has been, according to Erwin Schrödinger, to determine the distribution of a given amount of energy E over N identical systems.

Statistical mechanics explains entropy as the amount of uncertainty (or "mixedupness", in the phrase of Gibbs) which remains about a system after its observable macroscopic properties have been taken into account. For a given set of macroscopic quantities, like temperature and volume, the entropy measures the degree to which the probability of the system is spread out over different possible quantum states. The more states available to the system with appreciable probability, the greater the entropy.
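The general statistical expression, often called the Gibbs entropy, makes this precise:

    S = -k_B \sum_i p_i \ln p_i

where p_i is the probability that the system is in microstate i. When all Ω accessible microstates are equally probable, p_i = 1/Ω and this reduces to Boltzmann's S = k_B ln Ω.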

On the molecular scale, the two definitions match up because adding heat to a system, which increases its classical thermodynamic entropy, also increases the system's thermal fluctuations, thereby increasing our lack of information about the exact microscopic state of the system, i.e. increasing its statistical mechanical entropy.

The entropy is dominated by the different arrangements possible on a molecular scale. There is entropy associated with macroscopic order (e.g. a shuffled pack of cards vs. the messy distribution of objects in a room), but it is negligible, because the number of macroscopic objects is tiny compared to the number of molecules. The entropy produced by the heat in your muscles while shuffling an ordered pack of cards is not negligible, because it is molecular in scale, while the entropy involved in creating a mess of cards is completely negligible.

Information theory

The concept of entropy in information theory describes how much randomness (or, alternatively, 'uncertainty') there is in a signal or random event. An alternative way to look at this is to talk about how much information is carried by the signal.

The entropy in statistical mechanics can be considered to be a specific application of Shannon entropy, according to a viewpoint known as MaxEnt thermodynamics. Roughly speaking, Shannon entropy is proportional to the minimum number of yes/no questions you have to ask to get the answer to some question.

The statistical mechanical entropy is then proportional to the minimum number of yes/no questions you have to ask in order to determine the microstate, given that you know the macrostate.
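A minimal sketch in Python illustrates this "yes/no questions" reading of Shannon entropy (the function name is introduced here only for illustration):

    import math

    def shannon_entropy(probs):
        # Shannon entropy in bits: the average number of yes/no
        # questions needed to identify an outcome drawn from probs.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin needs exactly one yes/no question per toss:
    print(shannon_entropy([0.5, 0.5]))   # 1.0 bit

    # A biased coin is more predictable, hence lower entropy:
    print(shannon_entropy([0.9, 0.1]))   # about 0.469 bits

    # Eight equally likely microstates take log2(8) = 3 questions:
    print(shannon_entropy([1/8] * 8))    # 3.0 bits

Eight equally likely microstates can be pinned down with three binary questions; the statistical mechanical entropy is, up to Boltzmann's constant and the use of natural rather than base-2 logarithms, exactly this count.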

The second law

An important law of physics, the second law of thermodynamics, states that the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value; by implication, the entropy of the universe (i.e. the system and its surroundings), regarded as an isolated system, tends to increase. We will consider the meaning of the "second law" further in a subsequent section. Two important consequences follow. First, heat cannot of itself pass from a colder to a hotter body: it is impossible to transfer heat from a cold to a hot reservoir without at the same time converting a certain amount of work to heat. Second, it is impossible for any device that operates on a cycle to receive heat from a single reservoir and produce a net amount of work; it can only get useful work out of the heat if heat is at the same time transferred from a hot to a cold reservoir. This means that an isolated perpetual motion machine is impossible. It also follows that a smaller increase of entropy in a specified process, such as a chemical reaction, means that the process is energetically more efficient.
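In compact symbolic form, the second law says:

    \Delta S_{\text{total}} = \Delta S_{\text{system}} + \Delta S_{\text{surroundings}} \ge 0

with equality holding only in the idealized limit of a reversible process.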

The arrow of time

Entropy is the only quantity in the physical sciences that "picks" a particular direction for time, sometimes called an arrow of time. As we go "forward" in time, the Second Law of Thermodynamics tells us that the entropy of an isolated system can only increase or remain the same; it cannot decrease. Hence, from one perspective, entropy measurement is thought of as a kind of clock.

Entropy and cosmology

We have previously mentioned that a finite universe may be considered an isolated system. As such, it may be subject to the Second Law of Thermodynamics, so that its total entropy is constantly increasing. It has been speculated that the universe is fated to a heat death in which all the energy ends up as a homogeneous distribution of thermal energy, so that no more work can be extracted from any source.

If the universe can be considered to have generally increasing entropy, then - as Roger Penrose has pointed out - an important role in the increase is played by gravity, which causes dispersed matter to accumulate into stars, which collapse eventually into black holes. Jacob Bekenstein and Stephen Hawking have shown that black holes have the maximum possible entropy of any object of equal size. This makes them likely end points of all entropy-increasing processes, if they are totally effective matter and energy traps. Hawking has, however, recently changed his stance on this aspect.

The role of entropy in cosmology remains a controversial subject. Recent work has cast extensive doubt on the heat death hypothesis and the applicability of any simple thermodynamic model to the universe in general. Although entropy does increase in the model of an expanding universe, the maximum possible entropy rises much more rapidly and leads to an "entropy gap," thus pushing the system further away from equilibrium with each time increment. Complicating factors, such as the energy density of the vacuum and macroscopic quantum effects, are difficult to reconcile with thermodynamical models, making any predictions of large-scale thermodynamics extremely difficult.


See also:

Logarithmic units

Maxwell's demon

Negentropy

Residual entropy

Statistical mechanics

Syntropy

Thermodynamic potential

References

  1. Clausius, Rudolf (1850). On the Motive Power of Heat, and on the Laws which can be deduced from it for the Theory of Heat. Poggendorff's Annalen der Physik, LXXIX (Dover Reprint). ISBN 0486590658.

  2. Laidler, Keith J. (1995). The Physical World of Chemistry. Oxford University Press. ISBN 0198559194.

  3. Published in Poggendorff's Annalen, Dec. 1854, vol. xciii, p. 481; translated in the Journal de Mathématiques, vol. xx, Paris, 1855, and in the Philosophical Magazine, August 1856, s. 4, vol. xii, p. 81.

Further reading

  1. Fermi, Enrico (1937). Thermodynamics. Prentice Hall. ISBN 048660361X.

  2. Kittel, Charles; Kroemer, Herbert (1980). Thermal Physics, 2nd Ed. W. H. Freeman Company. ISBN 0716710889.

  3. Penrose, Roger (2005). The Road to Reality : A Complete Guide to the Laws of the Universe. ISBN 0679454438.

  4. Reif, F. (1965). Fundamentals of statistical and thermal physics. McGraw-Hill. ISBN 0070518009.

  5. Goldstein, Martin; Goldstein, Inge F. (1993). The Refrigerator and the Universe. Harvard University Press. ISBN 0674753259.


Entropy History & Overview

July 14, 2006