Player (42)
Joined: 12/27/2008
Posts: 873
Location: Germany
jimsfriend reminded me that there are other free energies that differ in the constraints applied to the system, i.e. whether heat is allowed to pass or not, constant pressure, etc. So, yeah, his post wasn't precise enough to tell which should be the right term. By the way, I'm used to the following definition: the difference in entropy between states 1 and 2 is the integral of dQ/T over any reversible process that starts from state 1 and ends at state 2. Of course, it has close to zero physical meaning, but my guess is that if you plug some formula from statistical mechanics into the logarithm-of-probabilities definition, you might be able to deduce it's the same integral.
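Here's a quick numerical sketch of that guess for the one case where both routes are easy (my own toy example in Python, assuming a reversible isothermal expansion of an ideal gas; one mole doubling in volume at 300 K is just an illustrative choice). The Clausius integral of dQ/T and the statistical change k*ln(Omega2/Omega1), which for this process reduces to N*k*ln(V2/V1), give the same number:

```python
import math

# Physical constants (CODATA values, rounded)
k_B = 1.380649e-23   # Boltzmann constant, J/K
N_A = 6.02214076e23  # Avogadro's number, 1/mol
R = k_B * N_A        # gas constant, J/(mol K)

# Illustrative numbers: 1 mol of ideal gas, reversible isothermal expansion
n = 1.0            # moles
T = 300.0          # kelvin
V1, V2 = 1.0, 2.0  # volume doubles (only the ratio matters)

# Thermodynamic route: for a reversible isothermal process dU = 0, so dQ = dW = P dV,
# and the Clausius integral of dQ/T is (1/T) * integral of (nRT/V) dV = nR ln(V2/V1)
Q = n * R * T * math.log(V2 / V1)   # heat absorbed, J
dS_clausius = Q / T                  # J/K

# Statistical route: each molecule has V2/V1 times as many positions available,
# so the microstate count picks up a factor (V2/V1)^N and
# dS = k_B * ln(Omega2/Omega1) = N * k_B * ln(V2/V1)
N = n * N_A
dS_statistical = N * k_B * math.log(V2 / V1)

print(dS_clausius)     # ~5.76 J/K
print(dS_statistical)  # ~5.76 J/K, same number
```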
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
p4wn3r wrote:
By the way, I'm used to the following definition: the difference of entropy between states 1 and 2 is the integral of dQ/T over any reversible process that starts from state 1 and ends at state 2.
The problem with mathematically accurate definitions is that they are very hard for laymen to understand. That's why more informal and understandable definitions are necessary, even if they might not be fully accurate or might lead to misunderstandings. When explaining General Relativity to a layman, you could give him the Einstein field equations, or when explaining quantum mechanics you could lay out the Maxwell equations, but that would be equivalent to explaining it in Klingon. More colloquial approximations (if not even metaphors) are necessary to give even a slight idea of what's going on. It will be inaccurate, of course, but the alternative is not explaining it at all (at least not in a language that normal people understand).
Player (42)
Joined: 12/27/2008
Posts: 873
Location: Germany
I totally agree with you. I only mentioned it to give a possible link between the definitions from thermodynamics and statistical mechanics; I really have no idea how they could relate to each other without using mathematics. And I think you mean Schrödinger's equation for QM; Maxwell's equations are still classical physics.
Joined: 5/30/2007
Posts: 324
Warp wrote:
The problem with mathematically accurate definitions is that they are very hard for laymen to understand. That's why more informal and understandable definitions are necessary, even if they might not be fully accurate or might lead to misunderstandings.
Very true, but you also don't want to give them a conceptual definition that is plain wrong, like you did.
Warp wrote:
or when explaining quantum mechanics you could lay out the Maxwell equations,
Huh? What do Maxwell equations have to do with quantum mech? Maxwell's equations are just classic E&M. You're probably thinking of Schrödinger's equation and/or de Broglie wavelengths. Edit- p4wn3r beat me to it...
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
IronSlayer wrote:
Very true, but you also don't want to give them a conceptual definition that is plain wrong, like you did.
I get the impression (not only from this thread but also from other threads) that you are trying to troll me, for whatever reason. Cut it out, will you?
Joined: 5/30/2007
Posts: 324
Warp wrote:
IronSlayer wrote:
Very true, but you also don't want to give them a conceptual definition that is plain wrong, like you did.
I get the impression (not only from this thread but also from other threads) that you are trying to troll me, for whatever reason. Cut it out, will you?
Not at all. I'm a scientist by work and education, so I was interested in the confusion surrounding the concept of entropy (which is a fascinating subject). Believe me, it has nothing to do with you or any other member. Then again, this also isn't the first time I've noticed you suffering from a persecution complex, either. The world doesn't revolve around you, Warp.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
IronSlayer wrote:
I'm a scientist by work and education, so I was interested in the confusion surrounding the concept of entropy.
Then perhaps Mr Scientist would be so nice as to go and fix the Wikipedia page on entropy because it's so plain wrong. May I ask in which field of science you have a degree, and which degree? Personally I have a degree in computing science, but that doesn't make me an expert on physics (much less thermodynamics), nor do I pretend to have such expertise (I have clearly stated that the definition I quoted is an informal one, most probably not the most accurate and exact one). Saying "I'm a scientist, hence you should believe me" isn't a very convincing argument in itself.
Player (80)
Joined: 8/5/2007
Posts: 865
Warp wrote:
IronSlayer wrote:
I'm a scientist by work and education, so I was interested in the confusion surrounding the concept of entropy.
Then perhaps Mr Scientist would be so nice as to go and fix the Wikipedia page on entropy because it's so plain wrong. May I ask in which field of science you have a degree, and which degree? Personally I have a degree in computing science, but that doesn't make me an expert on physics (much less thermodynamics), nor do I pretend to have such expertise (I have clearly stated that the definition I quoted is an informal one, most probably not the most accurate and exact one). Saying "I'm a scientist, hence you should believe me" isn't a very convincing argument in itself.
To your credit, Warp, you seem to have moderate know-how when it comes to physical concepts, even if you lack the nuts and bolts needed to solve physical problems. I think Marzojr has us both beat in that respect.
Joined: 5/30/2007
Posts: 324
Warp wrote:
Then perhaps Mr Scientist would be so nice as to go and fix the Wikipedia page on entropy because it's so plain wrong.
I mentioned I was a scientist in the context of being interested in the definition of entropy, not because it means you should believe me. In fact, I would hope everyone here is persuaded by logical arguments, not sheer authority. As for Wikipedia, heh, don't remind me. I'm an editor there (I mainly concentrate on chess articles) and forcing through some of the edits can be excruciating...
Warp wrote:
May I ask in which field of science you have a degree, and which degree?
I have a bachelor's degree in math and I do research in mathematical economics. However, I seriously considered majoring in physics instead and have taken a number of advanced classes in it, including statistical mechanics, thermodynamics, and quantum mechanics. (The topics being discussed)
Warp wrote:
Personally I have a degree in computing science, but that doesn't make me an expert on physics (much less thermodynamics), nor do I pretend to have such expertise (I have clearly stated that the definition I quoted is an informal one, most probably not the most accurate and exact one). Saying "I'm a scientist, hence you should believe me" isn't a very convincing argument in itself.
I'm not an "expert" on physics either, but it's a subject I enjoy. Once again, you accused me of caring about you, and I merely explained why it is that I care about physics, not about random forum members.
Player (80)
Joined: 8/5/2007
Posts: 865
IronSlayer wrote:
Once again, you accused me of caring about you, and I merely replied why it is that I care about physics, not random forum members.
Be the bigger man, IronSlayer. Give it a rest.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
IronSlayer wrote:
I have a bachelor's degree in math
Then I outrank you because I have a master's degree, so my authority is bigger than yours, ha! ;) Seriously, though (and honestly not to attack or belittle you), arguing that you object to someone's definition of entropy "as a scientist" is not very honest if what you have is an education and minor degree in math. (That's not to say that you are not competent in physics. It's just that using the label "scientist" is dishonest if you are not really a scientist in the correct field of science.)
Joined: 5/30/2007
Posts: 324
Warp wrote:
IronSlayer wrote:
I have a bachelor's degree in math
Then I outrank you because I have a master's degree, so my authority is bigger than yours, ha! ;) Seriously, though (and honestly not to attack or belittle you), arguing that you object to someone's definition of entropy "as a scientist" is not very honest if what you have is an education and minor degree in math. (That's not to say that you are not competent in physics. It's just that using the label "scientist" is dishonest if you are not really a scientist in the correct field of science.)
Dude, I've humored you and answered your (irrelevant) questions. At the end of the day, your definition of entropy was wrong, quantitatively AND conceptually. That's the only thing that matters. The stuff about degrees, "experts", and accusing me of caring about you personally is all meaningless BS. Besides the point. And as Bobo noted, I've humored your silly ramblings long enough.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
IronSlayer wrote:
The stuff about degrees, "experts", and accusing me of caring about you personally is all meaningless BS. Besides the point. And as Bobo noted, I've humored your silly ramblings long enough.
I must admit you are a master at trolling. I tried to lighten the mood in my last post with some humor, and to start an actual discussion, to no avail. You still keep belittling and insulting me, and presenting claims with no arguments whatsoever, and you succeed in doing it in the most annoying way possible. If you were trying to get on my nerves, then congratulations, you succeeded.
Player (80)
Joined: 8/5/2007
Posts: 865
I don't recall Warp saying, "MY DEFINITION OF ENTROPY IS THE RIGHT ONE! ANYONE WHO SAYS OTHERWISE IS LYING! I'M A SUPER-SMART MATH GUY AND THEREFORE YOU CANNOT QUESTION MY DEFINITIONS!" In fact, his actual quote was
Warp wrote:
Isn't a more modern, semi-informal definition that entropy describes how much energy there is available for useful work in a system?
(Note the question mark.) So I have a much simpler answer that should satisfy everyone: "No."
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
Bobo the King wrote:
So I have a much simpler answer that should satisfy everyone: "No."
I got the notion from wikipedia, where it's described exactly like that. If it's incorrect, then what would be a more correct description, in layman terms? (I'm honestly asking because I want to know.) A more common (and probably older) description is that entropy describes the amount of "chaos" in the system, but AFAIK that's an even more vague and inaccurate description. In some sense it might be equivalent to "amount of energy available for useful work" (if we understand that as some kind of ordering, and hence the less ordered the system, the less able it is to perform useful work). Also, equating entropy with chaos leads to tons of confusion.
Player (80)
Joined: 8/5/2007
Posts: 865
Warp wrote:
Bobo the King wrote:
So I have a much simpler answer that should satisfy everyone: "No."
I got the notion from wikipedia, where it's described exactly like that. If it's incorrect, then what would be a more correct description, in layman terms? (I'm honestly asking because I want to know.) A more common (and probably older) description is that entropy describes the amount of "chaos" in the system, but AFAIK that's an even more vague and inaccurate description. In some sense it might be equivalent to "amount of energy available for useful work" (if we understand that as some kind of ordering, and hence the less ordered the system, the less able it is to perform useful work). Also, equating entropy with chaos leads to tons of confusion.
Well, you're asking for a layperson's definition of a concept that's mathematically convoluted. If I were talking to my non-physicist friends or family, I'd just say entropy is the amount of disorder in a system. As far as I know, there's no middle ground between that and the formal, mathematical (or physical) definition. Fortunately, I'm not talking to friends or family, I'm talking to you, and I know you're not afraid to get your feet at least a little wet. The best I can do is walk you through a toy problem.

The following is based on the overall magnetization of a paramagnetic substance, but in case you're not familiar with the physical laws governing that (though it's not especially crucial), we'll just work with a very large deck of cards. I have adapted this example from Daniel Schroeder's Thermal Physics and my own memory.

Suppose you have an enormous shuffled deck of cards that are all either red or black. You do not know what ratio the two flavors of cards come in, nor do you need to. You deal off a substantial number of them, say 100, then count up the number of red cards and subtract from it the number of black cards. This is your macrostate. It is what we can observe macroscopically (perhaps we have a machine that can count the number of red and black cards, but doesn't tell us what order they came in). What we are interested in is the number of microstates corresponding to this macrostate -- i.e., how many ways we might have achieved the same number. For example, if we found 99 red and 1 black, the number of microstates corresponding to the macrostate would be 100 -- the first card could be the black one, or the second one, or the third one, etc., all the way up to the 100th one.

The general formula for the number of microstates corresponding to a single macrostate in this system is Ω = nCr(N, Nblack) = N!/(Nred! * Nblack!), where N is the total number of cards, Nred is the number of red cards, and Nblack is the number of black cards. You can quickly check that Ω is 1 if all the cards are either red or black (there is clearly just one microstate that corresponds with each of these macrostates) and that Ω is 100 if just one card is red or black, as I showed earlier. If the deck is fair, then the most "likely" state is with half the cards red and half of them black, which can be analyzed using Stirling's approximation to yield an Ω of roughly 10^29. Clearly, as the deck's distribution of cards gets closer to 50-50, the dealt hand will strongly gravitate toward having 50 red cards and 50 black cards (plus or minus 7 or so, according to probability theory).

So let's say there's a 50-50 distribution of red and black cards. That's the macrostate, and I've already computed the corresponding number of microstates. What is the entropy of the system? Ω by itself has two problems: first, it's a large number that will grow catastrophically (more than exponentially) as the size of the system increases; second, it has the wrong units. To solve the first problem, we take the natural log. To solve the second, we multiply by the Boltzmann constant, 1.38*10^-23 J/K. The entropy of this deck-of-cards system is therefore 9.22*10^-22 J/K. Note that, consistent with the intuitive definition, the entropy would be very low if nearly all the cards were red or nearly all were black, and it is at its highest when half the cards are red and half are black.

As it turns out, this definition happens to be consistent with the definition demanded by the first law of thermodynamics: dE = T*dS - P*dV + mu*dN.
I cannot show this for you, not because it is too complicated but because I don't regularly use statistical mechanics and I am not familiar with the proof offhand. If you followed the derivation up to this point and are still interested, I'll try to look it up.

Almost all derivations of entropy are dependent on the ergodic assumption, which basically states that all microstates are equally probable (in this context, it would mean the deck is well shuffled). As a counterexample to this, suppose there is an ever so slight electrostatic attraction between red and black cards, due to the inks used to make them. In this case, you would be slightly more likely to draw a red followed by a black or a black followed by a red rather than two consecutive cards of the same color. Trying to account for this will wreak havoc on your probability distribution, rendering the problem practically unsolvable (I think you can get around it by using techniques like the Metropolis algorithm, but this is getting too technical even for me). Thankfully, the ergodic assumption is at least very plausible for systems near their most probable state, but it is nevertheless unsettling from a theoretical standpoint (at least I think so).

If you want to apply this to other systems -- say, an ideal gas -- you would determine the macrostate (the pressure, temperature, and volume), then divide up the position and momentum space of the system into equal quantum bins and examine the probability for each particle to be found in a bin. This leads to popular statements like, "The probability that all the air molecules in this room will suddenly rush to one side and leave a vacuum on the other is vanishingly small." I might have even been able to carry out that calculation for you a few months ago, but statistical mechanics is all a blur to me.

Edit: By the way, I think the integral of the temperature with respect to entropy from a system's initial state to its maximally "scrambled" state is the exergy, as long as it's not changing in volume or allowed to exchange particles with the universe. This is just a guess, though. At the very least, it should provide an upper bound to the exergy in such cases. Because the conditions I placed are so finicky, I'm not surprised they give exergy its own definition.
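In case anyone wants to check the deck arithmetic above, here's a short Python sketch (exact binomial coefficient, no Stirling's approximation needed, since 100 cards is well within reach of a computer):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

N = 100  # cards dealt

def entropy(n_black):
    """Entropy of the macrostate 'n_black black cards out of N', via S = k_B ln(Omega)."""
    omega = math.comb(N, n_black)  # number of orderings giving this count
    return omega, k_B * math.log(omega)

# Extreme macrostate: 99 red, 1 black -> Omega = 100
print(entropy(1))    # (100, ~6.4e-23 J/K)

# Most probable macrostate: 50 red, 50 black
omega, S = entropy(50)
print(omega)         # ~1.01e29, the "roughly 10^29" above
print(S)             # ~9.2e-22 J/K, matching the figure above
```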
Tub
Joined: 6/25/2005
Posts: 1377
Is there a rigorous definition of "macrostate", or would you just pick any macrostate you like? For example, with the card deck we might also observe "cards angled slightly to the right when dealt" and "cards angled slightly to the left when dealt". We can use your formulas on that to get an entropy value, but it doesn't tell us anything about red/black any more, but about the dealer's hands. Can both be the entropy of the system, despite being different? For the gas, you're naming pressure, temperature and volume. Do you observe them separately? Are they a combined macrostate? Also, how would you determine the entropy of a card deck dealt inside an ideal gas? Can we even say "The entropy of this system is X", or do we have to add "..with respect to macrostate A" every time? And if so, how do we avoid comparing apples with oranges when dealing with entropy?
m00
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
I think part of the confusion here is that, it seems, there are two ways of interpreting entropy, the classical viewpoint and the statistical viewpoint.
Wikipedia wrote:
Thermodynamic entropy is more generally defined from a statistical thermodynamics viewpoint, in which the molecular nature of matter is explicitly considered. Alternatively entropy can be defined from a classical thermodynamics viewpoint, in which the molecular interactions are not considered and instead the system is viewed from perspective of the gross motion of very large masses of molecules and the behavior of individual molecules is averaged and obscured. (...) From a macroscopic perspective, in classical thermodynamics the entropy is interpreted as a state function of a thermodynamic system: that is, a property depending only on the current state of the system, independent of how that state came to be achieved. The state function has the important property that, when multiplied by a reference temperature, it can be understood as a measure of the amount of energy in a physical system that cannot be used to do thermodynamic work; i.e., work mediated by thermal energy.
Player (80)
Joined: 8/5/2007
Posts: 865
Tub wrote:
Is there a rigorous definition of "macrostate", or would you just pick any macrostate you like?
As far as I'm aware, there is no such rigorous definition, but it's not difficult to come up with intuitive ones. We will never have the microstate of a large system (too many atoms to keep track of all at once, plus you would then start running into problems with your observations affecting the state). Whatever the macrostate is, it had better be something you can measure... macroscopically. It might be the difference in height of a mercury column that is open to the atmosphere on one end and in a vacuum on the other end (pressure) or it might be the height of a liquid in a bulb as it expands to fill a thin tube (temperature). Whatever it is, the observation should be statistically-based, usually around some mean value of some sort. While it isn't the definition of temperature, it can be shown that the temperature in many physical systems is proportional to the average kinetic energy of its component molecules. Some molecules will move a little fast, some a little slow, but temperature is your mean, macroscopic measurement. Perhaps the best way to "answer" your question is to ask you what your measuring device is. When properly interpreted, it should give a description of the macrostate.
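On that kinetic-energy remark, here's a tiny sketch (my own numbers; nitrogen at room temperature is just a convenient example) of how the macroscopic reading, temperature, connects to the microscopic average: the mean translational kinetic energy is (3/2)*k_B*T, which for N2 at 300 K corresponds to an rms speed of roughly 500 m/s.

```python
import math

k_B = 1.380649e-23           # J/K
T = 300.0                    # a room-temperature-ish thermometer reading, K
m_N2 = 28.014 * 1.66054e-27  # mass of one nitrogen molecule, kg

# Macroscopic measurement (temperature) <-> microscopic average (kinetic energy)
mean_KE = 1.5 * k_B * T                 # ~6.2e-21 J per molecule
v_rms = math.sqrt(3 * k_B * T / m_N2)   # ~517 m/s

print(mean_KE)
print(v_rms)
```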
Tub wrote:
For example, with the card deck we might also observe "cards angled slightly to the right when dealt" and "cards angled slightly to the left when dealt". We can use your formulas on that to get an entropy value, but it doesn't tell us anything about red/black any more, but about the dealer's hands. Can both be the entropy of the system, despite being different?
Let's say the system is quantized so that every card either tilts slightly to the left or slightly to the right (they act like spin-1/2 particles, by analogy). They never tilt more or less to one side, nor do they point straight up. In this case, we have two observations: red vs. black and tilted left vs. tilted right. If these variables can only be observed independently, we would need to revise our definition of the entropy to include not only red minus black but also tilted right minus tilted left. If the variables can be observed simultaneously, hoo boy. Then we need to categorize it into red tilted left, red tilted right, black tilted left, and black tilted right and then I'd need to know how you're measuring the macrostate since you should have a single scalar quantity out of all those different possibilities. In either case, however, we follow the exact same process as before: count up all the microstates corresponding to the macrostate we observe. I believe this kind of bookkeeping of degrees of freedom comes up all the time in statistical mechanics. If you "forget" a degree of freedom, you will make different predictions about the system (if you're lucky, it won't affect the system appreciably in the temperature regimes you're interested in).
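If it helps, here's how that bookkeeping looks in a quick sketch (my own toy numbers, reusing the 100-card deck): when color and tilt are reported as two separate counts, the microstate counts multiply and the entropies add; when the machine reports the four joint counts instead, that finer macrostate is compatible with fewer microstates, so its entropy is lower.

```python
import math

k_B = 1.380649e-23  # J/K
N = 100             # cards dealt, as in the earlier example

# Case 1: color and tilt are reported as two separate numbers (50 black,
# 50 tilted right). A microstate is the full sequence of (color, tilt) pairs;
# if the two attributes are independent, the counts multiply.
omega_color = math.comb(N, 50)
omega_tilt = math.comb(N, 50)
S_separate = k_B * math.log(omega_color * omega_tilt)
# Taking the log turns the product into a sum, so the entropies just add:
print(S_separate, k_B * math.log(omega_color) + k_B * math.log(omega_tilt))

# Case 2: the machine reports the four joint counts (red-left, red-right,
# black-left, black-right), here 25 each. Every such ordering also has
# 50 black and 50 right-tilted cards, so this is a strictly finer macrostate:
# fewer compatible microstates (a multinomial coefficient) and lower entropy.
omega_joint = math.factorial(N) // (math.factorial(25) ** 4)
S_joint = k_B * math.log(omega_joint)
print(S_joint < S_separate)  # True: measuring more leaves less entropy
```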
Tub wrote:
For the gas, you're naming pressure, temperature and volume. Do you observe them separately? Are they a combined macrostate?
I believe they are a combined macrostate (any three of the four relevant quantities will completely define the state of an ideal gas). Don't blindly trust me on that. (Or any of this, actually. I'm doing my best to draw on my intuitive understanding of Stat Mech. I am woefully bad at tackling real problems, but I hear that's not such a big deal when it comes to Stat Mech.)
Tub wrote:
Also, how would you determine the entropy of a card deck dealt inside an ideal gas?
*SLAP* In all seriousness, this is actually an easy problem to tackle. Unless conditions are extremely unusual (many cards, low pressure, low temperature, tiny volume of gas), the ideal gas is going to have many orders of magnitude more entropy than the deck of cards. Therefore, the entropy of the cards won't add an appreciable amount of entropy. You can see this in the roughly 10^-21 J/K result I offered earlier for 100 cards. I believe that the entropy of a typical ideal gas (the kind you'd be able to deal cards in) is roughly of order unity in J/K. It's basically a logarithm (order unity, plus or minus a few orders of magnitude) times the Boltzmann constant, times the number of atoms. Since we're talking moles of atoms, we end up with order unity. For a more thorough answer, as above, you would just add their individual entropies. This is because the ideal gas shouldn't affect the entropy of the deck or vice-versa. Microstates of the deck are totally independent of microstates of the gas, so to find the total number of microstates of the combined system, just multiply the number of microstates of the component systems together. Since we take the log of this number, this is equivalent to adding the entropies.
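To put rough numbers on "many orders of magnitude", here's a sketch (my own; the Sackur-Tetrode formula for a monatomic ideal gas, with one mole of helium at roughly room conditions as an arbitrary illustrative choice) comparing the gas entropy with the 100-card deck entropy from before. Since the microstate counts of independent subsystems multiply, the entropies add, and the deck's contribution disappears into rounding:

```python
import math

# Constants
k_B = 1.380649e-23      # J/K
N_A = 6.02214076e23     # 1/mol
h = 6.62607015e-34      # Planck constant, J s

# --- Deck of 100 cards, 50-50 macrostate (from the earlier example) ---
S_deck = k_B * math.log(math.comb(100, 50))   # ~9.2e-22 J/K

# --- One mole of helium as a monatomic ideal gas (illustrative numbers) ---
T = 298.15                  # K
P = 1.0e5                   # Pa
N = N_A
m = 4.0026 * 1.66054e-27    # mass of a helium atom, kg
V = N * k_B * T / P         # ideal gas law

# Sackur-Tetrode: S = N k_B [ ln( (V/N) (2 pi m k_B T / h^2)^(3/2) ) + 5/2 ]
S_gas = N * k_B * (math.log((V / N) * (2 * math.pi * m * k_B * T / h**2) ** 1.5) + 2.5)

# Microstate counts multiply for independent subsystems, so entropies add;
# the deck's contribution is utterly negligible next to the gas's.
S_total = S_gas + S_deck
print(S_deck)           # ~9.2e-22 J/K
print(S_gas)            # ~1.3e2 J/K
print(S_total - S_gas)  # prints 0.0: the deck is below double-precision resolution here
```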
Tub wrote:
Can we even say "The entropy of this system is X", or do we have to add "..with respect to macrostate A" every time? And if so, how do we avoid comparing apples with oranges when dealing with entropy?
This is a very good question and I'm not sure how to answer it. I would say that certainly if you have enough physical quantities to completely define the macroscopic state of the system, then your entropy must be consistent, regardless of what quantities you are measuring. I believe, however, that if you have forgotten a crucial measurement (say you measured the pressure, temperature, and volume but not the net electric dipole moment, which happens to be measurable and nonzero), you don't get away with the entropy you calculate. The entropy of the fully-described system is accurate while the entropy you have is incorrect. This keeps you from gaining information via willful ignorance (something Bayesians think they can do).
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
If we have a closed system that does some thermodynamic work, over time it will be able to do less and less work as entropy inside the system increases (until at some point it will not be able to do any work at all because entropy is too large). If we have two such closed systems, identical in all other aspects except that in one of them entropy is significantly larger than in the other, and both of them are put to do the same work, then the one with the larger entropy will be able to do less work than the one with the lower entropy. Hence there's a strong correlation between entropy and the amount of work that a closed system can do. What's the catch? I still don't quite understand why describing entropy as "a thermodynamic property that can be used to determine the amount of energy available for useful work" is so wrong. What exactly is wrong about it?
Player (42)
Joined: 12/27/2008
Posts: 873
Location: Germany
Well, I don't think anybody said that entropy can't be used to determine how much work a system can do. What we said is just that entropy and energy available are different things. They certainly are related, but they're not the same.
Joined: 5/30/2007
Posts: 324
Tub wrote:
Is there a rigorous definition of "macrostate", or would you just pick any macrostate you like? For example, with the card deck we might also observe "cards angled slightly to the right when dealt" and "cards angled slightly to the left when dealt". We can use your formulas on that to get an entropy value, but it doesn't tell us anything about red/black any more, but about the dealer's hands. Can both be the entropy of the system, despite being different? For the gas, you're naming pressure, temperature and volume. Do you observe them separately? Are they a combined macrostate? Also, how would you determine the entropy of a card deck dealt inside an ideal gas?
Well, the macrostate itself is a measurable quantity (99 red cards, 1 black card, in the example). However, the way in which you define and measure macrostates is indeed something you get to choose ("the number of red and black cards"). That's why there's an "entropy of information" and other similar applications. Your tilt-based definition for the entropy of the card deck is just as valid as counting the number of reds and blacks.
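On that information-entropy connection, a small sketch (same 100-card, 50-50 example as above, nothing new assumed): the same Ω that gives the thermodynamic entropy also tells you how many bits it takes to pin down the exact ordering once the macrostate is known, and the two numbers differ only by the constant factor k_B*ln(2).

```python
import math

k_B = 1.380649e-23  # J/K

omega = math.comb(100, 50)          # microstates for the 50-50 macrostate

S_thermo = k_B * math.log(omega)    # thermodynamic entropy, J/K
S_info = math.log2(omega)           # bits needed to specify the exact ordering

print(S_thermo)   # ~9.2e-22 J/K
print(S_info)     # ~96.3 bits
# The two differ only by a constant factor: S_thermo = k_B * ln(2) * S_info
print(k_B * math.log(2) * S_info)   # ~9.2e-22 J/K again
```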
Tub wrote:
Can we even say "The entropy of this system is X", or do we have to add "..with respect to macrostate A" every time? And if so, how do we avoid comparing apples with oranges when dealing with entropy?
You don't have to add the qualifier every time, since it's usually understood what you're talking about. Also, I can't think of an example where one would be manipulating entropies of macrostates calculated through two different approaches. If you did, you would just calculate one of them through the same approach as the other macrostate.
Banned User
Joined: 3/10/2004
Posts: 7698
Location: Finland
p4wn3r wrote:
Well, I don't think anybody said that entropy can't be used to determine how much work a system can do. What we said is just that entropy and energy available are different things.
And I never said they are the same thing. Let me quote my original text:
Warp wrote:
Isn't a more modern, semi-informal definition that entropy describes how much energy there is available for useful work in a system
"Entropy describes" != "entropy is". (Granted, perhaps "describes" is a bit vague, and a better expression would be "can be used to express/measure" or "is directly correlated to".)
Player (42)
Joined: 12/27/2008
Posts: 873
Location: Germany
Or the problem is that you used the word "definition", because defining something is exactly saying what this thing is. Whatever, if it's not what you meant, then it's not what you meant :P
Player (80)
Joined: 8/5/2007
Posts: 865
Warp wrote:
If we have a closed system that does some thermodynamic work, over time it will be able to do less and less work as entropy inside the system increases (until at some point it will not be able to do any work at all because entropy is too large). If we have two such closed systems, identical in all other aspects except that in one of them entropy is significantly larger than in the other, and both of them are put to do the same work, then the one with the larger entropy will be able to do less work than the one with the lower entropy. Hence there's a strong correlation between entropy and the amount of work that a closed system can do. What's the catch? I still don't quite understand why describing entropy as "a thermodynamic property that can be used to determine the amount of energy available for useful work" is so wrong. What exactly is wrong about it?
Coming back to this problem fresh, I think I've spotted the flaw. To remove useful energy from the system, it must do work. In thermodynamics, work is the P*dV term in the first law. When I suggested that the integral of T*dS from the initial to the final state might be equal to the exergy if the volume is held constant, I was wrong, because if the volume is constant, no work is being done. Therefore, exergy is surely related to entropy in a nontrivial way. For a closed system (energy conserved), the heat exchanged is in fact equal to the work done by the system. That follows directly from the first law of thermodynamics, and I should have been much quicker to spot it.
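A quick numerical check of that last point (my own sketch, reusing the isothermal ideal-gas example, where the internal energy really is constant so dE = 0 holds): the heat absorbed, T times the entropy change, comes out equal to the work done, the integral of P dV.

```python
import math

k_B = 1.380649e-23
N_A = 6.02214076e23
R = k_B * N_A          # gas constant, J/(mol K)

n, T = 1.0, 300.0      # 1 mol at 300 K (illustrative)
V1, V2 = 0.01, 0.02    # m^3, volume doubles along the isotherm

# Work done by the gas: integral of P dV with P = nRT/V, evaluated numerically
steps = 100000
dV = (V2 - V1) / steps
W = sum(n * R * T / (V1 + (i + 0.5) * dV) * dV for i in range(steps))

# Heat absorbed: T * dS, with dS = n R ln(V2/V1) (the same number the
# statistical count N k_B ln(V2/V1) gives)
Q = T * n * R * math.log(V2 / V1)

print(W)  # ~1729 J
print(Q)  # ~1729 J: with dE = 0, heat in equals work out
```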