Sean, Wow! What a ride over the past several months. Thanks so much for making this information available to the average Joe (or Josie) on the street. I'll be listening to it all again!
I vote for Carroll's Law: Everything interesting happens in the middle.
Awesome! You vote for physical laws in America? How cool is that!
Applies to the human body as well!
@@PrimatoFortunato Yes, I keep voting against conservation of energy because I want infinite battery life, but powerful tech lobbies keep that law in place.
What if the universe started in the middle?
@@ankiesiii in the middle of WHAT?
What a great series! I've been through more in-depth videos with more math, but they tend to get very dry and tiring to follow. Your series should be a prerequisite for other in-depth lectures, because it makes everything make so much more sense!
Thanks for answering my question.
I didn't see this one until Monday morning, but as always, it was well worth the wait. Thanks again Professor Sean, for this amazing series. I am only sorry that we are near the end. Though there is still one more, yes? This whole series has made this tiresome lockdown bearable. I will be watching all the episodes again anyway. And I am really grateful that I could actually understand so much, without being able to do calculus.
51:13 Finally a good calculus lesson. Thanks! 😊
Thanks, Professor! I feel like I could listen to you talk about Entropy forever, but that's obviously me not taking into account a possible change in my attitude over long enough periods.
Or the heat death of the universe
Oh, Sean could talk about 'steam' for three hours straight, and I'd still watch/listen! :D
When I attempted to give you a wholehearted "thumbs up," the number of "likes" was 137. So I made it 138. This was the very least I could do to quiet our minds about complexity.
Results of this lesson, it's now late, I should have gone to bed, and I really want a coffee. Good talk tho
These videos have reached a new level of criticality and complexity!
Sean, thanks for the integration - that was a fun combination of both yelling at my phone and memories of high school!
This should be easy, you wrote a whole textbook on the subject.
I tell my students "all models are wrong, but some are useful" to get them in the right frame of mind.
Carroll, are these videos available on any other websites besides YouTube? Thank you
Yay! You're reminding me of Ilya Prigogine, Sean
To measure complexity, you should not measure the description of the image, but the description of the probability distribution from which the image is sampled. That would be simple on both extremes of entropy, but large in the middle.
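To make that concrete, here's a minimal sketch (my own toy setup, not anything from the video): model the sampling distribution as a per-pixel probability-of-cream map and take its compressed size as the description length. The swirl field, the 4-bit quantization, and zlib-as-description-length are all assumptions of the sketch.

```python
import zlib
import numpy as np

N = 64

def prob_map(stage):
    """Per-pixel probability that a pixel is 'cream' at a given mixing stage."""
    y, x = np.mgrid[0:N, 0:N] / N
    if stage == "separated":          # cream on top: a deterministic 0/1 map
        return (y < 0.5).astype(float)
    if stage == "mixed":              # fully mixed: iid coin flips everywhere,
        return np.full((N, N), 0.5)   # so the *distribution* is one number
    # mid-mix: a swirly, spatially varying probability field (made up)
    return 0.5 + 0.5 * np.sin(6 * x + 4 * np.sin(5 * y)) * np.cos(5 * y)

def description_length(p):
    """Compressed size, in bytes, of the quantized distribution."""
    q = np.round(p * 15).astype(np.uint8)     # 4-bit quantization
    return len(zlib.compress(q.tobytes(), 9))

for stage in ("separated", "swirly", "mixed"):
    print(stage, description_length(prob_map(stage)))
```

Both extremes compress to almost nothing (a step and a constant), while the swirly map in the middle needs far more bytes - exactly the hump in the video's complexity plot.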
Has anyone tried analyzing the Fourier transform in relation to complexity and entropy? It seems like that would capture scale dependent behaviors better than linear comparison.
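One cheap way to try it (a rough sketch with made-up signals, not a worked-out proposal): take the Shannon entropy of the normalized power spectrum, which measures how structure is spread across scales.

```python
import numpy as np

rng = np.random.default_rng(1)

def spectral_entropy(signal):
    """Shannon entropy (bits) of the normalized power spectrum."""
    power = np.abs(np.fft.rfft(signal)) ** 2
    power = power[1:]              # drop the DC component
    p = power / power.sum()        # treat the spectrum as a distribution
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

n = 4096
t = np.arange(n)
step  = np.where(t < n // 2, 1.0, 0.0)        # "separated": one sharp edge
swirl = np.sin(0.02 * t) * np.sin(0.11 * t)   # mid-mix: a few interacting scales
noise = rng.random(n)                         # fully mixed: white noise

for name, s in (("step", step), ("swirl", swirl), ("noise", noise)):
    print(name, round(spectral_entropy(s), 2))
```

White noise comes out near the maximum (flat spectrum) and the swirl is concentrated at a few frequencies, so this tracks entropy more than complexity - but analyzing it band by band might get at the scale-dependent behavior you mean.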
It would be simply awesome if Dr. Carroll gave a course about statistical mechanics in this way. Somehow the XIXth century does not get the spotlight it deserves, be it in science or history. In science it’s maybe because the XXth was awesome; historically it’s probably because western civilizations were all about conquering and subjugating. Still, many social questions we still struggle with today were posed there and then.
Where was I? Yes! Statistical mechanics. Gotta love it.
It's interesting that your complexity vs. entropy plot is reminiscent of the entropy term -p·log(p)
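It does look like that. A quick sanity check on that single term (nothing deeper implied): -p·log(p) vanishes at both ends and peaks in between, at p = 1/e.

```python
import numpy as np

p = np.linspace(1e-9, 1, 100_000)
f = -p * np.log(p)                 # one term of the Shannon entropy sum
print("peak at p =", round(p[np.argmax(f)], 4), "; 1/e =", round(1 / np.e, 4))
```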
It may be fruitful to ask how does the macrostate evolve vs. the information that you need to distinguish that macrostate from all others. The separated cream state needs just a few bits of information. The state with swirls needs a ton of information if you're trying to reproduce the *specific* swirls. But having generic swirls evolves from the separated state without adding any information. Thus distinguish the absolute rarity of the branch (initial information) from how the branch itself evolves. If you accept all branching as real, like in many worlds, the 2nd law should be stated as conservation of the information that selects a branch while all the sub-branches are expanded out. Alternatively you could say that branching, or to be precise decoherence and self-locating, adds information to define a sub-branch. The 2nd law is keeping track of the information that defines the branch, in either perspective.
Cool. Thx for the video
❤ Very good 👍🏼
Consider average meteorite size with a sample spanning a few years... with a time including Tunguska... with a time including Chicxulub.
I think the "meaningfulness" of the answer is tied to the time scale of the original data: include next year's data, and the average won't change.
But even with a complete characterization of the data over theoretical "all time" (via knowledge of near-Earth objects), how _useful_ is that average? For planning purposes, you want a value that matches the time interval of applicability. Building all rooftops to withstand the "average" meteorite size is a waste of money, since the average is inflated by rare events that the entire building would not survive anyway.
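A toy version of that point, with entirely made-up numbers: for a heavy-tailed size distribution, the sample mean is dragged up by the rare monsters while the median stays at "typical rock".

```python
import numpy as np

rng = np.random.default_rng(42)
# classical Pareto with tail index 1.2: the mean exists, but barely
sizes = rng.pareto(1.2, size=1_000_000) + 1

print("mean:              ", round(float(sizes.mean()), 2))
print("median:            ", round(float(np.median(sizes)), 2))
print("99.99th percentile:", round(float(np.percentile(sizes, 99.99)), 2))
```

The mean lands several times above the median; designing every rooftop for the mean is paying for Chicxulub on every building.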
So if you had a gas (or any substance, really) in a very good insulating container, like a thermos, would it tend to approach a uniform distribution rather than a M-B distribution?
Does it look more M-B the better the thermal coupling and more uniform the more isolated?
What about the universe as a whole: with no "outside" will M-B not apply?
In thermal equilibrium it is always M-B; it has nothing to do with coupling. Also, even when coupled, we reach thermal equilibrium eventually. If the universe were a non-expanding, static, isolated system, you would get M-B eventually. Note that M-B depends on temperature. The universe is expanding, so I am not sure it will ever reach equilibrium, and if not, we may never get M-B. Then there is the question of whether the universe is truly an isolated system or not. Also, you can have maximum entropy without a uniform distribution. And very low temperatures give an M-B that is essentially spiked near zero velocity, or "uniform" in the sense that every atom has close to zero velocity. My guess is eventual heat death, essentially near-zero temperature, and essentially a spiked distribution around zero velocity. All of that is classical SM and thermo; not sure how QFT would modify it, if at all.
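For anyone who wants to poke at the temperature point numerically, a small sketch (my setup, in units with m = k_B = 1): sampling each velocity component from a Gaussian at temperature T gives Maxwell-Boltzmann speeds, and as T goes to zero the whole distribution spikes toward zero velocity.

```python
import numpy as np

rng = np.random.default_rng(7)

def mb_speeds(T, n=100_000):
    """Speeds of n particles: each Cartesian component ~ N(0, sqrt(T))."""
    v = rng.normal(0.0, np.sqrt(T), size=(n, 3))
    return np.linalg.norm(v, axis=1)

for T in (10.0, 1.0, 0.01):
    s = mb_speeds(T)
    # the M-B most-probable speed is sqrt(2T) in these units
    print(f"T={T:5}: mean speed {s.mean():.3f}, expected peak {np.sqrt(2 * T):.3f}")
```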
Hello Mr. Carroll. Is it possible to reverse entropy? Or is there insufficient data for a meaningful answer?
Not Sean, but I heard him discuss this before. Life reverses entropy locally. It takes high-entropy matter and energy and uses it to build lower-entropy structures, then expels high-entropy waste energy. The whole system doesn't break the 2nd law, but there are pockets where it is reversed.
With energy input, yes.
Isn't the change in energy under expansion somewhat analogous to what happens when you boost relative to a source? If you look at photons arriving from the source, they will be Doppler-shifted. If you look at arriving non-relativistic particles ("matter"), they won't be; at least their energy won't, in the same approximation of neglecting kinetic energy.
While the energy and the ratio of matter to radiation energy will be different, nobody would call that non-conservation of energy.
What expansion has done is boost relative to the source.
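For reference, the standard scalings behind this analogy (my own summary, not from the video): both photon and particle momenta dilute as $p \propto 1/a$, so

$$E_\gamma(a) = E_\gamma(a_0)\,\frac{a_0}{a}, \qquad E_\text{matter}(a) \approx mc^2 + \frac{p_0^2}{2m}\left(\frac{a_0}{a}\right)^2 \approx mc^2,$$

i.e. a photon's entire energy redshifts away, while for non-relativistic matter only the negligible kinetic piece does - which is just the Doppler/boost picture above.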
40:59 When you chuckle and say "Well, it's funny...." I started laughing so hard that I couldn't hear what you were saying, because I knew without a doubt that whatever you were saying COULDN'T possibly be funny. I ran it back four times before I was able to hear it without laughing hysterically. Of course I was right. But what WAS funny is that I laughed knowing I was right. I am sure that you don't have a staff of comedy writers, but nevertheless you are on to something. I wish Hollywood would make a movie about a theoretical physicist who was popular and did have a staff of comedy writers to appear less nerdy, but only succeeded in popularizing the serious study of science by comics and strippers.
I wouldn't watch it, but I would love to see people of below-average intelligence coming up with thought experiments. Well, gotta practice my Barry Harris "borrowing diminished" movements.
41:28: What troubles me here is that compression algorithms operate on entropy (as defined by Shannon), not complexity. High-entropy sources are difficult to compress; low-entropy sources are easy to compress. If you have a completely random source, you typically do not manage to compress it very well (have you ever tried to compress white noise using JPEG? It's not pretty). At least that is true when it comes to lossless entropy encoding. JPEG is not lossless, so you would be discarding parts of the source data and partially reconstructing it during decompression. To me it would seem easiest to compress the low-entropy source (the left-most coffee cup), somewhat harder to compress the medium-entropy source (the partially mixed coffee cup), and very difficult to compress the highest-entropy source, where you have a completely random distribution of coffee vs. cream. However, what you are saying is that you apply some kind of coarse graining to the distribution in that case to get rid of all the randomness, which seems like cheating to me, simply because it amounts to lowpass filtering / blur-kernel processing or whatever you want to call it. You can just pick a coarse-graining filter kernel that very efficiently removes the randomness in the high-entropy source and produces a very smooth, uniformly distributed source (all pixels have the same gray color). Once you have found a way to reduce the post-filtering entropy to a point where it is lower than that of the middle coffee cup, you will have proven your point, and the JPEG compression will be able to compress that source more efficiently. Had it not been for the coarse graining, you would not be able to prove anything here.
Unlike your discussion of coarse graining in a previous episode, where you argued that coarse graining is not arbitrary (that is, you are not allowed to pick a coarse-graining strategy that lets you prove that, e.g., the universe started in a high-entropy state and evolved towards a low-entropy state, because that is inconsistent with observational data - a perfectly fair point), in this case it is utterly arbitrary. You are tossing out details to make your mixed coffee easier to compress - that is all you are doing. The reason you need to do this is that you depend on entropy being lower in the fully mixed coffee cup, and that will never be the case without some kind of filtering. The problem is that you really need a quantitative definition of complexity. It seems like you don't want complexity to be the same as entropy, but data compression does not care about complexity - only entropy - so it doesn't work in that context.
44:37: Or are you saying that you pick one coarse-graining filter kernel and apply it consistently in all steps of the experiment / simulation - one that is bound to minimize the post-coarse-graining entropy of the fully mixed coffee cup - so that, regardless of the model for how the coffee and cream mix in the intermediate steps (the middle coffee cup), you can show that the tectonic model of mixing gives rise to higher complexity in the intermediate steps and then falls back to zero complexity at the end? OK, that I can buy, as long as you are consistent with how you coarse grain in all steps and do not tune your filter kernel simply to reach your desired conclusions.
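For what it's worth, here is a minimal sketch of exactly that consistent version (synthetic stand-in images, not the video's actual simulation): one fixed block-average kernel applied identically to all three stages, with zlib instead of JPEG so there are no codec quality settings to tune.

```python
import zlib
import numpy as np

rng = np.random.default_rng(0)
N, K = 512, 32     # image size and the one fixed coarse-graining block size

def coarse_grain(img):
    """K-by-K block average plus 15-level quantization -- the same
    kernel for every stage, never tuned per image."""
    blocks = img.reshape(N // K, K, N // K, K).mean(axis=(1, 3))
    return np.round(blocks * 14).astype(np.uint8)

def apparent_complexity(img):
    """Compressed size (bytes) of the coarse-grained image."""
    return len(zlib.compress(coarse_grain(img).tobytes(), 9))

y, x = np.mgrid[0:N, 0:N] / N
separated = (y < 0.5).astype(float)                                # cream on top
swirly = (np.sin(30 * x + 10 * np.sin(20 * y)) > 0).astype(float)  # mid-mix
mixed = (rng.random((N, N)) < 0.5).astype(float)                   # fully mixed

for name, img in (("separated", separated), ("swirly", swirly), ("mixed", mixed)):
    print(name, apparent_complexity(img), "bytes")
```

The block is big enough that the fully mixed stage averages out to near-constant gray, so its coarse-grained description really does drop below the swirly stage's without retuning anything per image - which I think is the only honest reading of the procedure.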
I'll go ahead and recommend against JPEG size because it can vary considerably; you need to specify parameters
Because the middle mixing of cream and coffee was not special, I would not call it complex. Any other interim mix would be just as good.
Isn't Conway's Game of Life (en.m.wikipedia.org/wiki/Conway's_Game_of_Life) a counterexample to the statement that local interactions cannot produce complexity?
I was going to mention something like this until I found your post. It seems that his description of the mixing had a lot of similarity to Conway's stuff, and by extension to Stephen Wolfram's A New Kind of Science ideas about simple rules that produce complexity.
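For readers who haven't seen it, here is a minimal sketch of the Game of Life update rule (toroidal wrap-around via np.roll, my own toy grid), just to show how little machinery those local interactions need:

```python
import numpy as np

def step(grid):
    """One Game of Life update on a wrap-around grid."""
    neighbors = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
                    for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                    if (dy, dx) != (0, 0))
    # a cell is alive next step if it has 3 neighbors,
    # or if it is alive now and has exactly 2
    return ((neighbors == 3) | ((grid == 1) & (neighbors == 2))).astype(np.uint8)

grid = np.zeros((12, 12), dtype=np.uint8)
grid[1:4, 1:4] = [[0, 1, 0], [0, 0, 1], [1, 1, 1]]   # a glider

for _ in range(8):
    grid = step(grid)
print(grid)   # after 8 steps the glider has moved 2 cells down and 2 right
```

Two rules, purely local, and it supports gliders, oscillators, and even universal computation - which is why it keeps coming up in these complexity discussions.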
I think your box analogy is also flawed. If I fill a box with light and expand it, energy will still be conserved. The photons will not be shifted in wavelength; they will only take longer to get across. I did notice you seemed to hesitate to use it.
E = mc²
m = E/c² (3D printers create through a single point)
Different ways of explaining the same thing: entropy and thermodynamics, all laws of motion and particles.
Right, wrong?..
cause and effect...
So 42 then. Thank you.
A great series coming to its conclusion - guess the final topic? Are we alone!?
Chaotic systems?
Life ? And / or the Origins of Life ?
There probably is life in some other part of the universe. But I doubt it would be close enough and sophisticated enough to visit planet Earth. Heck, if they were that capable, they would probably steer away from us.
Is this the last one or is number 24 gonna be the last of this series?
One more was mentioned in the previous video in the series, IIRC.
@@Attlanttizz thank you
@@Attlanttizz I heard the same thing.
Carroll 2024
I bet Dr. Alan Feldman (U. of Maryland, Los Alamos, Pasadena, Danny's basement) will see this!!!
Well did he?
Sean, have you ever heard of the "grim reaper paradox"? Interested in your take. I think it's pretty silly, but...
If the universe was simple, and as such did not have any parts with different microscopic configurations, then it can be thought of as a system with one particle. If we agree so far, then, like a single-particle system, its micro- and macroscopic states are identical, and it almost trivially follows why the entropy was low at the big bang. Why do we need a further explanation for that? What is the fallacy in this argument? This argument basically follows what Sean said - at the beginning the universe was very small, dense, and smooth, i.e. simple. Of course, then the real question is why the universe was small, dense, and smooth. Sure.
I sometimes think of the big bang this way - when we watch a big bomb explode from a close distance, we are blinded by the light of the explosion and cannot see things near it. It is just the light of the explosion. Is that the case with the observable universe? Does the blinding light of the big bang and the wall of the CMB prevent us from seeing the ambient, surrounding, extended universe?
Gravitational waves and neutrinos will let us see past the CMB.
Don’t understand “empty” space in relationship to gravity
Is that a command, like "Have a nice day."?
John Długosz nope just don’t understand how anything can be empty
@@nancymencke6980 You mean - like most people cannot understand sentences without punctuation.
Empty space has a tiny bit of energy. That makes gravity.
What does a person do, who doesn't want to live, but doesn't want to die?
Takes a nap
Find a purpose in life. Like a hobby, a wife or husband, having kids, playing chess, studying physics, etc.
Vibrate. (Hopefully with some damping.)
Take a first step.
Takes Schrödinger's cat out of the box and enters it instead.
First one😀!
You want a cookie? 😝
☺️☺️☺️
Congrats! ;)
@@seymoronion8371 ☺️☺️
W.O.W.
The real world is important? Preposterous!