Entropy




Entropy (an artificial Greek word, εντροπία [entropía], from εν~ [en~], "in", and τροπή [tropḗ], "turn, transformation") is an extensive state variable in thermodynamics. Every equilibrium state of a thermodynamic system can be assigned a unique entropy value. In statistical physics, entropy is a measure of the phase-space volume accessible to the system. In classical thermodynamics, entropy is a measure of adiabatic reachability.

Basics

Entropy is a thermodynamic quantity with which heat transfers and irreversible steps in thermodynamic processes can be accounted for quantitatively and represented clearly (see T-s diagram). The entropy S (unit J/K) is an extensive state variable, like volume, electrical charge or amount of substance. Dividing by the mass yields the specific entropy s with the unit J/(kg·K) as an intensive state variable. The German physicist Rudolf Clausius introduced the term, which he coined, in 1865 to describe cyclic processes.

According to Clausius, for reversible processes the differential dS is the ratio of the transferred heat δQ to the absolute temperature T:

dS = δQ_rev / T

This entropy change is positive when heat is supplied and negative when heat is removed. Clausius also described the increase in entropy in an isolated system through irreversible processes, without any heat transfer, by the inequality:

dS > 0

Today a distinction is made between the transported entropy (first equation) and the produced entropy (second equation), and both equations are combined in a form that also holds for non-adiabatic systems:

dS = δQ / T + δW_diss / T

Here δW_diss is the work dissipated within the system, which is always positive, i.e. it is supplied to the system (e.g. frictional work). The third equation is one form of the second law of thermodynamics. The differential dS, however, exists only for quasi-static changes of state, i.e. for a sequence of states that can be followed by measurement (states with only small departures from equilibrium). This is not the case for the process shown in the figure for the adiabatic system, for which only the initial and final states can be specified. For an ideal gas, however, the entropy difference can easily be calculated via a reversible isothermal replacement process (release of work and absorption of heat from the surroundings), as is done in the section Examples.
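As a purely numerical illustration of this balance, the following sketch evaluates the transported and the produced entropy for assumed values of temperature, supplied heat and dissipated work; the numbers are illustrative and not taken from the article.

```python
# Minimal sketch of the entropy balance dS = dQ/T + dW_diss/T
# (illustrative values, assumed constant temperature).

T = 300.0        # absolute temperature in K
Q = 1000.0       # heat supplied to the system in J
W_diss = 50.0    # work dissipated inside the system (friction) in J, always >= 0

dS_transported = Q / T        # entropy carried into the system with the heat
dS_produced = W_diss / T      # entropy produced irreversibly inside the system
dS_total = dS_transported + dS_produced

print(f"transported entropy: {dS_transported:.3f} J/K")
print(f"produced entropy:    {dS_produced:.3f} J/K")
print(f"total change dS:     {dS_total:.3f} J/K")
```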

History of the term "entropy"

Besides energy, entropy is the most important concept in thermodynamics, and for a better understanding it is helpful to return to the starting point of this science and recapitulate its development. In 1712 Thomas Newcomen's first steam engine was installed in a mine to pump water. The machine did its job, but needed a great deal of fuel. At that time the connection between energy and heat was completely unclear, and more than 130 years would pass before Julius Mayer published the first law of thermodynamics. From 1764 James Watt improved the steam engine and, without knowing the underlying theory, more than doubled its efficiency to over 1%. It was not until 60 years later that the young French engineer Sadi Carnot had the decisive idea, which he published in 1824. Inspired by his father's work on water mills, Carnot described a steam engine by a cyclic process in which heat flows from a hot source to a cold sink and does work in the process. The ratio of the mechanical work extracted, ΔW, to the heat introduced, ΔQ, was the efficiency:

η = ΔW / ΔQ

In his original paper Carnot held the view that heat is a kind of imponderable substance that always flows from a hot to a cooler body, much as water always flows downhill. And just as with falling water, the greater the drop, the more work the heat could do; in particular, the machine could not do more work than corresponds to the heat supplied. Carnot later corrected himself and recognized the equivalence of heat and energy a decade before Mayer, Joule and Thomson. He was thus far ahead of his time; unfortunately he died young and his work at first went unnoticed. It was Clausius who first formulated the connection between the temperature difference between source and sink and the efficiency of the Carnot engine, and who recognized that this efficiency cannot be exceeded by any other heat engine, since otherwise heat would flow spontaneously from a cold to a hot body. The impossibility of such a process in nature is known today as the second law of thermodynamics. Clausius formulated it in terms of a cyclic process:

There is no cyclically operating machine whose sole effect is to transport heat from a cooler to a warmer reservoir.

Put more simply, the second law states that temperature differences in nature cannot increase spontaneously. From this requirement Clausius was able to derive the relation

∮ δQ / T ≤ 0

for any cycle; the equals sign applies only to reversible processes. From this result of Clausius it is natural to define the quantity

dS = δQ_rev / T

differentially. Clausius called this quantity entropy, and over time it became common to formulate the second law directly in terms of entropy, which in no way leads to a deeper understanding. Only decades later was Ludwig Boltzmann, with his statistical mechanics, able to explain entropy as a measure of the number of microstates attainable by the system. Heat is energy randomly distributed over atoms and molecules, and it flows from hot to cold because the reverse path is simply far too improbable.

In 1999 the theoretical physicists Elliott Lieb and Jakob Yngvason placed the definition of entropy in phenomenological thermodynamics on a strictly axiomatic basis. This definition does not use quantities such as "heat" and "temperature", which cannot be defined precisely without entropy, but relies instead on the concept of adiabatic accessibility.

Aids to understanding

Contrary to everyday usage, energy is not consumed in the physical sense but only converted, for example in machines (first law of thermodynamics: conservation of energy). Over one cycle a gasoline engine is supplied with the same amount of energy in the form of fuel as it gives off as drive work and heat. Since the drive work is eventually converted into heat by friction, the entire energy content of the fuel ends up in the environment as heat, apart from any portions converted into potential energy or, mechanically, into deformation energy. The energy was therefore not consumed, only converted. A quantity was thus needed to describe the ability of energy to do work, since the amount of energy alone says nothing about this ability. In principle the world's oceans contain an enormous amount of energy, but since it is at ambient temperature, no work can be done with it.

Clausius found that the higher the temperature at which an amount of energy is supplied to a machine, the more work can be done with it (see Carnot efficiency). In the example of the engine, the fuel energy is supplied by combustion at roughly 2000-2500 °C and leaves the engine again at roughly 50 °C through the radiator and via the wheels. With the help of Clausius' equations it can be stated how much work the engine could produce at most: the energy supplied has a low entropy, the waste heat a high entropy, and the maximum possible work output follows from the difference.
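The Carnot bound for the temperatures quoted above can be estimated in a few lines. The sketch below assumes the waste heat leaves at 50 °C and evaluates η_max = 1 − T_cold/T_hot, which real engines do not reach.

```python
# Carnot (maximum) efficiency for the temperatures quoted above.
# eta_max = 1 - T_cold / T_hot, with temperatures in kelvin.

def carnot_efficiency(t_hot_c: float, t_cold_c: float) -> float:
    """Upper bound on the efficiency of a heat engine working
    between t_hot_c and t_cold_c (given in degrees Celsius)."""
    t_hot = t_hot_c + 273.15
    t_cold = t_cold_c + 273.15
    return 1.0 - t_cold / t_hot

for t_hot in (2000.0, 2500.0):
    eta = carnot_efficiency(t_hot, 50.0)
    print(f"T_hot = {t_hot:.0f} °C, T_cold = 50 °C -> eta_max = {eta:.2%}")
```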

Problems with the concept of entropy

In popular-science books, but also in many textbooks, entropy is equated with disorder. This analogy holds for some systems; an ordered crystal, for example, has a much lower entropy than its melt. For other systems the picture is problematic; an ordered biomembrane in water, for example, has a higher entropy than its disordered components dissolved in water (see the examples below). The main problem is that the colloquial term disorder is not clearly defined and that entropy is not a measure of the symmetry of the system but of the number of microscopically attainable states, independently of their degree of order, however that is defined. The term disorder is therefore best avoided.

Confusion also arises because the term entropy is used in different disciplines with reference to different phenomena. The discovery of entropy in thermodynamics and its central role in that theory did not prevent its transfer to other fields, such as information theory. Entropy is a statistically defined quantity and can be used meaningfully in many contexts. Nevertheless, the definitions in the individual disciplines can be inconsistent or even contradictory. Norbert Wiener used the term entropy to describe information phenomena just as Claude Elwood Shannon did, albeit with a negative sign. That Shannon's definition has prevailed is mainly due to the better technical usability of his work. This example makes clear, however, that interdisciplinary use of the concept of entropy requires at least caution and careful reference to the sources.

Entropy is not a directly measurable quantity such as, for example, the temperature or the pressure. Only changes of entropy can be determined, and it is not a strict conservation quantity like the energy, mass, particle number or charge of a system. This is also an essential difference between the first and second laws of thermodynamics: while the first law is nothing other than the formulation of the strictly valid law of conservation of energy in the language of thermodynamics, the second law is merely a statistically justified statement. The probability of a violation of the second law in macroscopic systems is, however, extremely small. It cannot be derived directly from the microscopic equations and, within classical mechanics, was even refuted by Poincaré. All these properties make the concept of entropy difficult to grasp.

Entropy in Thermodynamics

An ideal process that can be reversed at any time without losses due to friction is called reversible. The entropy often remains unchanged during a process, ΔS = 0; a well-known example is the adiabatic compression and expansion in the cycle of a Carnot engine. Changes of state with constant entropy are called isentropic, but not every isentropic change of state is adiabatic. If a process is adiabatic and reversible, however, it is always isentropic as well.

If, in a cycle, the heat Q_H is absorbed at the temperature T_H and the amount of heat Q_l is released again at T_l, then the entropy balance of the two reservoirs requires

Q_l / T_l ≥ Q_H / T_H ,

with equality for a reversible cycle. From this the maximum work output A = Q_H − Q_l and the maximum efficiency η_max = 1 − T_l / T_H can be derived.

Just as the temperature indicates the statistically averaged energy of the particles in a many-body system, Boltzmann was able to show that entropy can also be defined statistically, namely as a function of the number of states that a many-body system can occupy:

S = k_B ln Ω

Here k_B is the Boltzmann constant, ln the natural logarithm and Ω the number of states that the particles of the system can assume. The choice of the base of the logarithm is not critical; it only changes the constant factor.

 

The picture on the right shows the mixing of a brown dye in water. At the beginning the dye is unevenly distributed; after a long enough wait, the water takes on a uniform colour.

Entropy is a measure of ignorance, not simply of disorder, so the terminology has to be used carefully. In the picture example the liquid in the right-hand glass looks "more tidy" because it is thoroughly mixed, yet the thorough mixing of water and dye particles corresponds to a greater number of possible arrangements. The entropy is therefore higher there than in the left-hand glass. Of the dye in the right-hand glass we know only that it is distributed throughout the water. The left-hand picture tells us more: we can identify regions with a high concentration of dye and regions that are free of dye.

The entropy of mixing can be calculated. Josiah Willard Gibbs pointed out the apparent contradiction that the increase in entropy should also occur if water were poured into the glass of water instead of the dye (Gibbs' paradox).

The number of arrangements of the dye molecules at the beginning is significantly smaller than when the dye can spread through the entire volume, because at first the dye molecules are concentrated in only a few regions, whereas in the right-hand picture they can be anywhere in the glass. The entropy is therefore greater there, which is why the system tends towards this uniform distribution over time.
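A toy lattice model makes this counting argument concrete. The sketch below assumes hypothetical numbers (20 dye molecules, 50 available cells for the concentrated state, 1000 cells for the spread-out state) and evaluates S = k_B ln Ω with Ω given by a binomial coefficient; it is an illustration, not a model of a real solution.

```python
# Toy lattice model of the mixing example (hypothetical numbers):
# n indistinguishable dye molecules distributed over m available cells.
# The number of arrangements is the binomial coefficient C(m, n),
# and S = k_B * ln(Omega).

from math import comb, log

K_B = 1.380649e-23  # Boltzmann constant in J/K

def lattice_entropy(n_molecules: int, n_cells: int) -> float:
    omega = comb(n_cells, n_molecules)   # number of microstates
    return K_B * log(omega)

n = 20
S_concentrated = lattice_entropy(n, 50)    # dye confined to a small region
S_spread = lattice_entropy(n, 1000)        # dye free to occupy the whole glass

print(f"S (concentrated): {S_concentrated:.3e} J/K")
print(f"S (spread out):   {S_spread:.3e} J/K")
```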

The entropy remains unchanged only if the processes are reversible. Real changes of state are always associated with dissipation (e.g. due to friction), which increases the entropy. The total entropy in an isolated system cannot decrease, but entropy can decrease locally if it increases correspondingly elsewhere in the system.

Second and third law

Rudolf Julius Emanuel Clausius recognized that the quantity given differentially by

dS = δQ_rev / T

is an extensive state variable, i.e. it is independent of the path taken and proportional to the size of the system. The notation δQ_rev instead of dQ emphasizes that the heat exchanged is path-dependent (see, for example, a cycle) and is therefore not a total differential.

Clausius also found that the entropy in an isolated system increases monotonically:

dS ≥ 0

He formulated this observation in the second law of thermodynamics as the negation of the existence of a perpetual motion machine of the second kind:

"There is no such thing as a cycle whose only effect is to transport heat from a colder reservoir to a warmer reservoir."

Otherwise an inexhaustible source of energy could apparently be constructed. Equivalent to this is the formulation by William Thomson:

"There is no such thing as a cycle that takes a quantity of heat from a reservoir and completely converts it into work."

In contrast to the already familiar extensive quantities of thermodynamic systems, such as the energy E, the volume V and the mass m, entropy at first eluded a deeper understanding. It could only be explained satisfactorily by Ludwig Boltzmann within statistical mechanics, as a measure of the phase-space volume that can be reached by the phase trajectory of the system while selected macroscopic observables, such as the temperature T, the volume V or the particle number N, are held constant.

Entropy is therefore a measure of the missing information about the actual microstate when only a small number of observable quantities are available to characterize the macrostate. The ergodic hypothesis asserts that the trajectory of the system does in fact cover, in the course of time, the entire phase volume measured by the entropy. Systems showing this behaviour are called ergodic, and only for these can the second law be applied meaningfully. Closely related to this is the irreversibility of processes in nature.

The third law (the so-called "Nernst heat theorem") defines the entropy of a perfectly crystalline substance at absolute zero as zero:

S(T = 0) = 0

One consequence is, for example, that the heat capacity of a system vanishes at low temperatures.

Examples

 

Gay-Lussac expansion experiment: The introduction describes Gay-Lussac's overflow experiment, in which a gas expands freely into an evacuated vessel so that its volume doubles. How large is the change in entropy in this experiment? Since entropy is a state variable, it is path-independent: instead of removing the partition, one can imagine sliding it slowly to the right until the final volume is reached. For an infinitesimal displacement the volume increases by dV and the entropy by dS = δQ / T. From the first law, dU = δQ + δW, it follows with dU = 0 and δW = − p dV (only volume work is done) that

δQ = p dV and hence dS = p dV / T .

With the ideal gas law (N is the number of gas atoms),

p V = N k_B T ,

this gives

dS = N k_B dV / V .

Integration immediately yields

ΔS = N k_B ln(V_2 / V_1) = N k_B ln 2

for the doubling of the volume. With N = 47 atoms, as drawn in the example above, this gives

ΔS = 47 k_B ln 2 ≈ 4.5 × 10^-22 J/K .

More realistic would be, for example, 1 mol of atoms, i.e. about 6.022 × 10^23 atoms, which gives

ΔS = N_A k_B ln 2 = R ln 2 ≈ 5.76 J/K .
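The two values can be checked numerically; the sketch below assumes, as described above, that the volume doubles.

```python
# Entropy of free expansion of an ideal gas, Delta_S = N * k_B * ln(V2/V1).
from math import log

K_B = 1.380649e-23      # Boltzmann constant in J/K
N_A = 6.02214076e23     # Avogadro constant in 1/mol

def expansion_entropy(n_particles: float, v_ratio: float) -> float:
    """Entropy change for an ideal gas expanding from V1 to V2 = v_ratio * V1."""
    return n_particles * K_B * log(v_ratio)

print(f"47 atoms, volume doubled: {expansion_entropy(47, 2):.3e} J/K")
print(f"1 mol,    volume doubled: {expansion_entropy(N_A, 2):.3f} J/K")
```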

Numerical example:

In a system that exchanges neither mass nor energy with its surroundings, the entropy can never decrease spontaneously. Example: 1 kg of water has an entropy of 151 J/K at 10 °C, 297 J/K at 20 °C and 437 J/K at 30 °C. 1 kg of cold water (10 °C) and 1 kg of warm water (30 °C) can, when brought into contact, spontaneously pass into the state of 2 kg of lukewarm water (20 °C), because the entropy of the initial state (151 + 437 = 588 J/K) is smaller than the entropy of the final state (297 + 297 = 594 J/K). The spontaneous reversal of this process is not possible, because the entropy of the system consisting of 2 kg of water would have to decrease from 594 J/K to 588 J/K, which would contradict the second law of thermodynamics.
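These figures can be checked roughly with a short sketch, assuming a constant specific heat capacity of water of about 4.19 kJ/(kg·K) and measuring entropies relative to 0 °C; this is an estimate, not an exact calculation.

```python
# Rough check of the water-mixing example: entropy of 1 kg of water
# relative to 0 °C, assuming a constant specific heat c = 4190 J/(kg K).
from math import log

C_WATER = 4190.0   # J/(kg K), approximate
T0 = 273.15        # reference temperature 0 °C in K

def s_water(t_celsius: float, mass_kg: float = 1.0) -> float:
    """Entropy of liquid water relative to 0 °C (constant c assumed)."""
    return mass_kg * C_WATER * log((t_celsius + 273.15) / T0)

s10, s20, s30 = s_water(10), s_water(20), s_water(30)
print(f"S(10 °C) = {s10:.0f} J/K, S(20 °C) = {s20:.0f} J/K, S(30 °C) = {s30:.0f} J/K")
print(f"initial state: {s10 + s30:.0f} J/K, final state: {2 * s20:.0f} J/K")
```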

Biomembranes: If lipids, the building blocks of biomembranes, are placed in water, closed membrane structures, so-called vesicles, form spontaneously. Since temperature and pressure are given here (heat bath and pressure ensemble), the thermodynamic potential that strives for a minimum is the free enthalpy ΔG = ΔH − TΔS. The enthalpy ΔH can be measured experimentally and is positive. Since the process takes place spontaneously, however, ΔG must be negative, i.e. the entropy must increase. At first glance this is confusing, since entropy is usually what drives substances to mix (entropy of mixing). The increase in entropy is due to a special property of water: it forms hydrogen bonds between the individual water molecules, which fluctuate constantly and thereby make a large contribution to the entropy of the water. When the lipids are dissolved in water, a larger region is created around the long fatty-acid chains in which hydrogen bonds can no longer be formed. In the regions around the fatty-acid chains the entropy contribution of the hydrogen bonds is missing, so the entropy decreases overall. This decrease is significantly larger than the increase expected from merely mixing the water and the lipid. When the fatty-acid chains aggregate, more hydrogen bonds can form and the entropy increases. Put another way, the ability of water to form fluctuating hydrogen bonds drives the lipids out of solution. Ultimately this property is also responsible for the poor solubility of many non-polar substances, which interfere with the formation of hydrogen bonds.

Living organisms: A living organism can, in a sense, be viewed as a thermodynamic machine that converts chemical energy into work and heat while producing entropy. According to the current state of research, however, it is not yet clear whether an entropy can be assigned to a biological system at all, since it is not in a state of thermodynamic equilibrium.

Other disciplines: In addition to its role as a fundamental state variable in phenomenological and statistical thermodynamics, entropy is used in other fields, in particular in information theory and in economics, where it has acquired an independent meaning. In astrophysics, for example, the concept of entropy is needed to describe star formation, white dwarfs, neutron stars, black holes (which have the highest entropy of all known physical systems), globular clusters, galaxies and galaxy clusters, and ultimately the entire cosmos.

Statistical Physics

Around 1880 Ludwig Boltzmann was able to explain entropy on the microscopic level with the statistical physics founded by him and James Maxwell. In statistical mechanics the behaviour of macroscopic thermodynamic systems is explained by the microscopic behaviour of their components, i.e. elementary particles and systems composed of them, such as atoms and molecules. A microstate is classically given by specifying all positions and momenta of the particles belonging to the system. Such a microstate is therefore an element of a 6N-dimensional vector space, which in this context is called the phase space. The canonical equations of classical mechanics describe the evolution of the system in time, the phase trajectory. All phase points attainable under given macroscopic boundary conditions, such as the total energy E, the volume V and the particle number N, form a connected phase-space volume Ω. The entropy is calculated in the SI system as

S = k_B ln Ω

with the unit J/K. In order actually to carry out this calculation, the macroscopic observables of the system under consideration must be known. The constant k_B is called the Boltzmann constant in recognition of Ludwig Boltzmann's contributions to the development of the statistical theory; he himself never determined its value.

Quantum mechanics

In quantum statistics, a microstate is given by a vector |ψ⟩ in the Hilbert space H.

This pure state contains all the information about the system that is accessible through an ideal measurement. A macrostate is classically given by an ensemble of microstates that share certain conserved quantities, such as energy, volume and particle number. The distribution of the microstates in phase space is classically given by a distribution function, which in the quantum mechanical description is replaced by the density operator

ρ = Σ_i p_i |ψ_i⟩⟨ψ_i| .

The expectation value of an observable A in the mixed state described by the density operator is given by

⟨A⟩ = Tr(ρ A) .

The entropy is given in terms of the probabilities of the individual microstates in the macrostate by

S = − k_B Σ_i p_i ln p_i ,

where p_i is the probability of finding the system in the i-th microstate (see the Stirling formula for the derivation of this relation) and k_B is the Boltzmann constant.
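A minimal sketch of this formula, applied to an assumed discrete distribution over microstates (four equally probable states, and a pure state for comparison):

```python
# Entropy S = -k_B * sum_i p_i ln p_i of a discrete probability
# distribution over microstates (illustrative probabilities).
from math import log

K_B = 1.380649e-23  # J/K

def gibbs_entropy(probabilities) -> float:
    assert abs(sum(probabilities) - 1.0) < 1e-12, "probabilities must sum to 1"
    return -K_B * sum(p * log(p) for p in probabilities if p > 0.0)

# Four equally probable microstates (as in the spin example below):
print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))   # = k_B * ln(4)
# A pure state (one microstate with probability 1) has zero entropy:
print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))       # = 0
```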

As an example, consider a spin system of 4 electrons in a magnetic field, each spin contributing −μB when aligned parallel to the field and +μB when antiparallel. Let the total energy be −2μB. This is the case when exactly three spins point up and one points down, and there are four ways of choosing the spin that points down.

It follows that Ω = 4 and S = k_B ln 4.
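The counting can be reproduced by brute force; the sketch below uses the sign convention assumed above (spin up contributes −μB, spin down +μB).

```python
# Count the microstates of 4 spins with total energy -2*mu*B.
# Convention (as above): spin up contributes -mu*B, spin down +mu*B.
from itertools import product
from math import log

K_B = 1.380649e-23  # J/K

def count_microstates(n_spins: int, target_energy_in_muB: int) -> int:
    count = 0
    for spins in product((+1, -1), repeat=n_spins):   # +1 = up, -1 = down
        energy = sum(-s for s in spins)               # in units of mu*B
        if energy == target_energy_in_muB:
            count += 1
    return count

omega = count_microstates(4, -2)
print(f"Omega = {omega}, S = k_B ln Omega = {K_B * log(omega):.3e} J/K")
```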

Up to a constant factor, this general formula is identical to the formula for the information entropy. This means that the physical entropy is also a measure of the information about the microstate that is missing when only the macrostate is known.

In statistical mechanics the second law of thermodynamics becomes a statement about probabilities: it is theoretically possible, for example, for heat to flow from the colder body to the warmer one, but it is so improbable that, with a probability bordering on certainty, it will not happen even within a time span millions of times the age of the universe.

For real systems at ordinary temperatures, individual states can no longer be counted; the number of states is replaced by the attainable volume in the multidimensional phase space.

Properties of the statistical entropy of a quantum mechanical state

Let ρ and σ be density operators on the Hilbert space H.

  • Invariance under unitary transformations of ρ: S(U ρ U†) = S(ρ) for unitary U (with U U† = 1)
  • The minimum, S = 0, is attained for pure states ρ = |ψ⟩⟨ψ|
  • The maximum, S = k_B ln N, is attained when all N possible state vectors occur with the same probability 1/N
  • Concavity: S(λρ + (1 − λ)σ) ≥ λ S(ρ) + (1 − λ) S(σ) with 0 ≤ λ ≤ 1
  • Subadditivity: if ρ is a density operator on H = H_1 ⊗ H_2 with reduced density operators ρ_1 and ρ_2 on H_1 and H_2, then S(ρ) ≤ S(ρ_1) + S(ρ_2)
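The following sketch illustrates the first three properties numerically for a two-level system, using the von Neumann form S = −k_B Tr(ρ ln ρ) with k_B set to 1; the matrices are illustrative and not taken from the article.

```python
# Sketch: von Neumann entropy S(rho) = -Tr(rho ln rho) (k_B set to 1)
# for small density matrices, and a check of the unitary invariance.
import numpy as np

def von_neumann_entropy(rho: np.ndarray) -> float:
    p = np.linalg.eigvalsh(rho)          # eigenvalues = probabilities
    p = p[p > 1e-12]                     # drop (numerically) zero eigenvalues
    return float(-np.sum(p * np.log(p)))

rho_pure = np.diag([1.0, 0.0])           # pure state: minimum S = 0
rho_mixed = np.diag([0.7, 0.3])          # mixed state
rho_max = np.eye(2) / 2                  # maximally mixed state: S = ln 2

theta = np.pi / 4                        # some unitary (here a real rotation)
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

print(von_neumann_entropy(rho_pure))                 # 0.0
print(von_neumann_entropy(rho_max))                  # ln 2 ~ 0.693
print(von_neumann_entropy(rho_mixed))                # ~0.611
print(von_neumann_entropy(U @ rho_mixed @ U.T))      # unchanged: ~0.611
```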

Quotes

  • "The overwhelming striving towards disorder does not mean that ordered structures such as stars and planets, or ordered forms of life such as plants and animals, cannot form. They can. And they obviously do. The second law of thermodynamics says that when order is produced, more than a compensating amount of disorder is produced along with it. The entropy account remains in surplus even if some components attain a higher degree of order." (Brian Greene: The Fabric of the Cosmos; German edition: Siedler, Munich 2004, ISBN 3-88680-738-X, p. 204 f.)
  • "This term is generally unpopular and is considered difficult, perhaps because it is a balance sheet but not a conservation factor and even has the unusual property of increasing, the more the less you pay attention." (Norbert Treitz: Bridge to physics, German publishing house, Frankfurt 2003, ISBN 3-8171-1681-0, Chapter 6.3)
  • "When you stir your rice pudding, Septimus, the jam spreads out, leaving red trails like the picture of a meteor in my astronomical atlas. But if you stir backwards, the jam will not come together again. Indeed, the pudding does not notice and continues to turn pink just as before." (Tom Stoppard: Arcadia, act 1, scene 1, dialogue between Thomasina and Septimus. In this play, which premiered in 1993, Stoppard addresses entropy at various points.)
  • "In nature, entropy takes on the role of director, but energy only that of an accountant." (Arnold Sommerfeld)

Literature

Textbooks

  • G. Adam, O. Hittmair: Heat theory. 4th edition. Vieweg, 1992, ISBN 3-528-33311-1.
  • Richard Becker: Theory of heat. 3rd, supplementary edition. Springer, 1985, ISBN 3-540-15383-7.
  • Johan Diedrich Fast: Entropy. Huethig, 1982, ISBN 3-87145-299-8.
  • Arnold Sommerfeld: Lectures on theoretical physics - thermodynamics and statistics. Reprint of the 2nd edition. Harri Deutsch, 1988, ISBN 3-87144-378-6.
  • André Thess: The Entropy Principle - Thermodynamics for the Dissatisfied. Oldenbourg Wissenschaftsverlag, 2007, ISBN 978-3-486-58428-8.

Popular science presentations

  • Arieh Ben-Naim: Entropy Demystified - The Second Law Reduced to Plain Common Sense. 2007, ISBN 981-270-055-2.
  • H. Dieter Zeh: Entropy. Fischer, 2005, ISBN 3-596-16127-4.
  • Jeremy Rifkin, Ted Howard: Entropy: A New World View. Viking Press, New York 1980 (German: Entropy: a new worldview. Hamburg, Hofmann & Campe, 1984).
  • V. J. Becker: God's Secret Thoughts: A Philosophical Digression to the Limits of Science and Mind. Books on Demand GmbH, Norderstedt, Germany 2006, ISBN 3-8334-4805-9.
