Do life or evolution violate the second law of thermodynamics?

At the molecular level, entropy is related to disorder. But the original definition of entropy is macroscopic: it is the heat transferred in a reversible process divided by the temperature at which the transfer occurs. This definition comes from thermodynamics, a classical, macroscopic theory, part of whose great power comes from the fact that it is developed without specifying the molecular nature of heat and temperature. Originally, entropy had no specific relation to order, and it had units of energy over temperature, such as J K⁻¹.

The Second Law of Thermodynamics says that entropy cannot decrease in a closed system, which is also correctly expressed by the comic songsters Flanders and Swann as 'heat cannot of itself pass from one body to a hotter body'. Entropy can decrease, and does so in a refrigerator when you turn it on, but a refrigerator is not a closed system: the fridge motor compresses the refrigerant gas, raising its temperature enough to allow heat to flow out, a process which exports more entropy into the kitchen than is lost by the cooling interior. The heat is not 'of itself' passing from cold (inside) to hotter. The increase of entropy in a closed system gives time a direction: S₂ > S₁ ⇔ t₂ > t₁.

A molecular interpretation comes from statistical mechanics, the meta-theory underlying thermodynamics. Statistical mechanics applies Newton's laws and quantum mechanics to molecular interactions. Boltzmann's microscopic definition of entropy is S = k_B ln W, where k_B is Boltzmann's constant and W is the number of different possible configurations of a system. On first encounter, it seems surprising, and wonderful, that the two definitions are equivalent, given the very different ideas and language involved.

S = k_B ln W implies that entropy quantifies disorder at the molecular level. Famously, this equation takes pride of place on Boltzmann's tombstone. Boltzmann's ideas were inadequately recognised in his lifetime, and he took his own life. Now his tombstone has the final say, and he is immortalised in k_B and the Boltzmann equation.

We often write that Boltzmann's constant is k_B = 1.38 × 10⁻²³ J K⁻¹, but this is not a statement about the universe in the same way that, for example, giving a value of G tells us about gravity. Rather, k_B = 1.38 × 10⁻²³ J K⁻¹ is our way of saying that the conversion factor between our units of joules and kelvins is 1.38 × 10⁻²³ J K⁻¹. We could logically have a system of units in which k_B = 1. (Or we could set the gas constant R = 1: less elegant, but it would give a more convenient scale for measuring temperature.) More on that below.

Macroscopic disorder has negligible entropy

In practice, entropy is often related to molecular disorder. For example, when a solid melts, it goes from an ordered state (each molecule in a particular place) to a state in which the molecules move, so the number of possible permutations of molecular positions increases enormously. The entropy produced by equilibrium melting is just the heat required to melt the solid divided by the melting temperature. For one gram of ice at 273 K (0 °C), this gives an entropy of melting of 1.2 J K⁻¹.

An important point: this is disorder at the molecular level. Macroscopic disorder makes negligible contributions to entropy. I can't blame the second law for the disorder in my office. To show you what I mean, let's be quantitative. If you shuffle an initially ordered pack of cards, you go from just one sequence to any of 52 × 51 × 50 × … × 3 × 2 × 1 = 52! possible sequences. So the entropy associated with the disordered sequence is about 3 × 10⁻²¹ J K⁻¹. The entropy produced by the heat in your muscles during the shuffling process would be very roughly 10²⁰ times larger.
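The quantitative claims above, the entropy of melting one gram of ice and the entropy of a shuffled deck, are easy to check numerically. Here is a minimal sketch in Python. The latent heat of fusion of ice (334 J per gram) is an assumed textbook value, and the ~20 J of metabolic heat dissipated at body temperature while shuffling is my own rough guess, not a figure from the article:

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K

# 1. Thermodynamic definition: entropy of melting = heat / melting temperature.
L_fusion = 334.0     # J to melt 1 g of ice (assumed textbook value)
T_melt = 273.0       # K
dS_melt = L_fusion / T_melt
print(f"Melting 1 g of ice: {dS_melt:.2f} J/K")        # ~1.2 J/K, as quoted

# 2. Boltzmann's definition: S = k_B ln W, with W = 52! card orderings.
S_deck = k_B * math.lgamma(53)  # lgamma(53) = ln(52!)
print(f"Shuffled deck: {S_deck:.1e} J/K")              # a few times 10^-21 J/K

# 3. Rough muscle comparison: assume ~20 J of heat released at body temperature.
S_muscle = 20.0 / 310.0
print(f"Muscle heat / deck ratio: {S_muscle / S_deck:.0e}")
```

The last ratio comes out around 10¹⁹–10²⁰, which is the point of the comparison: even the tiny heat of shuffling swamps the configurational entropy of the cards, so macroscopic disorder contributes negligibly.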