What Is Entropy, Really?
- Published Oct 4, 2024
Entropy is usually defined as "disorder," but that is not quite the right way to think of it. A better and more precise way is to think of it as the number of ways the microscopic components of a system can be arranged without affecting its macroscopic properties. High-entropy systems can be arranged in more ways than low-entropy systems. In practice this is often indistinguishable from disorder, which is why "disorder" is the common simplified definition.
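This counting view can be illustrated with a toy model (my own sketch, not from the video): particles scattered between the two halves of a box, where the macrostate is just "how many are on the left."

```python
from math import comb

# Toy model: 10 gas particles, each independently in the left or
# right half of a box. A "macrostate" is the count on the left;
# a "microstate" is the full assignment of particles to halves.
N = 10
for left in range(N + 1):
    # Number of microstates consistent with this macrostate.
    ways = comb(N, left)
    print(f"{left} particles on the left: {ways} arrangements")

# The evenly spread macrostate (5 left / 5 right) has the most
# arrangements -- the highest entropy -- while "all on one side"
# has only 1 arrangement each.
```

The evenly mixed state wins not because it is "messier" but simply because far more microstates look like it.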
Full Video Here: ua-cam.com/video/CkAPhZ2QMg4/v-deo.html
If entropy is always increasing, then does it mean that there was 0 entropy at the big bang? Could the singularity not be disordered? If big bang singularities and black hole singularities are similar, does this imply that entropy at the centre of a black hole is 0? And what about before the big bang? Was there negative entropy?
@@dvans5435 Great question. There is really nothing like zero entropy, except perhaps at absolute zero. But this would not have been the case at the Big Bang. Why entropy was low at the BB is not really known. Alan Guth has suggested that maximum entropy may be infinite, so no matter where you start, there is always some higher level of entropy. Entropy inside a black hole is actually quite high. At the singularity, we don't know what exists, or whether a singularity itself even really exists. You can have decreasing entropy, but overall entropy can never be negative.
@@ArvinAsh tnx
@@dvans5435
Yes.
Our universe is decreasing in entropy and will continue to decrease until it reaches entropy zero.
Crystallized, motionless.
Then it will explode, reversing course to increasing entropy, another big bang.
Eventually it will again become more ordered, moving to lower entropy.
This cycle is eternal.
Expansion and contraction, peaks and troughs of a giant wave.
Breathe in, breathe out.
We are the cosmos, and the cosmos is us.
"measure of disorder" is a simplification used in lectures... Strictly and formally entropy is not a measure of disorder, but as you said a quantification of configurations
Or as the 2nd law of teenagedynamics states: in any closed bedroom, entropy always increases with time.
Until consciousness is factored in. It is an innate characteristic of the fabric of the universe, representing a counter-acting force, constantly 'luring' the physical world towards ever-increasing order. That's how sentient life exists.
I was not expecting that end sound effect 😭
Thank you. I’ve searched many videos, and this is the first on entropy I found that I can wrap my head around.
"order" and "disorder" are completely arbitrary terms which don't mean anything outside of human perception; gas particles in a room that are in a state of equilibrium still look pretty ordered, to me at least. I always find it more helpful to think about how spread out the energy in a system is, especially when talking about entropy in terms of the age of the universe. Like, matter and energy will tend to spread out evenly over time; that's heat death.
Disorder is certainly definable by humans. It is statistically random dispersion which cannot be defined in an orderly way. As the Universe as we know it continues to expand, so will random dispersion continue, no matter how smooth humans perceive it to be.
@@MIN0RITY-REP0RT My issue, and the issue for most laypeople, is that while a randomly dispersed system is statistically disordered, it can still seem pretty ordered to a common observer. I just find that when explaining entropy it's better to think about order more in terms of dispersal. You're obviously correct, though. I wasn't proposing changing our model of entropy at all, just considering a more intuitive way to explain it. I personally never really started to understand entropy until it was explained to me as it applies to ecology during one of my undergrad papers. Honestly, I've never been a fan of nomenclature in physics; terms are often used in ways very different from their typical English meanings. (I know all scientific nomenclature is arbitrary; language itself is arbitrary.) But obviously if I ever write about entropy academically I still use terms like disorder to describe it. Dispersal is just a better way to teach it, in my opinion :)
Order and disorder are not totally arbitrary words. Whether that has meaning outside of human perception is irrelevant and meaningless. Gas particles in a state of equilibrium could only be attained at absolute zero in which case it probably would look the same to you, we would have to acknowledge your subjective opinion regarding the "would pretty much look the same" thing. Otherwise a very thoughtful comment.
We literally have them defined, that’s the point of words.
@@davidtaylor-cc7ig Yeah, honestly. I probably should have clarified that my point is more just I personally find the terms order and disorder to be unhelpful when teaching someone entropy for the first time. I always found it easier to understand when it was explained in terms of dispersal. I think students sometimes struggle because particles in equilibrium can still look quite "ordered" subjectively. I suppose the issue is that all we have is human perception so what is outside of it is irrelevant. Thanks for your thoughtful and challenging response.
This is the only video that helped me understand, thank you
A neuron can have even 200,000 afferent (or input) synapses, and at most one axon, as output channel to myriads other neurons via its efferent (or output) synapses, which are afferent to the latter. That's the case of Purkinje cells in the cerebellum.
Each afferent synapse can input a nervous signal, which may have nothing to do with the signals the other afferent synapses channel in. Therefore, there may be up to about 2²⁰⁰ ⁰⁰⁰ (roughly 10⁶⁰ ²⁰⁶) different neuron-stimulating patterns; given that the number of particles in the universe is of the order of 10⁸⁴, and that the output axon signal has only two states (either an action potential, or spike, or its absence), we can say a neuron does an excellent job of reducing information entropy to a minimum.
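As a sanity check on that figure, the exponent can be computed directly (a quick sketch; the 200,000-synapse count is the commenter's assumption, not mine):

```python
import math

# Treat each of 200,000 synapses as independently active or silent,
# giving 2**200000 possible input patterns. Converting to a power
# of ten: log10(2**n) = n * log10(2).
n_synapses = 200_000
digits = n_synapses * math.log10(2)
print(f"2**{n_synapses} ~= 10**{digits:.0f}")  # ~10**60206
```

So the count is about 10⁶⁰ ²⁰⁶; still astronomically larger than any count of particles in the observable universe.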
It’s a negentropic process
Life processes are really good at reducing local entropy or keeping it low (of course by increasing the entropy of neighbouring systems). Entropy is a really fascinating concept to apply to ecology. David M. Wilkinson brings it into his Earth Systems model of ecology (which I have some issues with, but I really like how it explains entropy and the idea that life is a "low entropy" island).
@@dfsnsdfn That's interesting. I always thought life would be high entropy.
@@Kj16V No, quite the opposite. Especially because a lot of the chemical processes involved in life are extremely complex, and therefore unlikely to happen spontaneously. Personally, I couldn't see life arising in a very high entropy system. In fact, you could say entropy increases when an organism dies and decays, but then other organisms are what is causing that decay, so lowering their own local entropy by increasing entropy elsewhere. I mainly do work in insect ecology (academically) and environmental risk analysis (privately), so no background in physics but I enjoy the topic and I think cross-disciplinary thinking is more important than ever these days.
@@dfsnsdfn Could you please explain what cross-disciplinary means?
Nicely explained
I always found these analogies/definitions of entropy to be too subjective. What feels like "order" to you is a mental preference, not clearly an objective property of the universe. So the simplest explanation I think of is that more spaced apart = more possible states = higher entropy. Less spaced apart = less possible states = lower entropy.
You just said what he said.
I think this is what makes the abiotic synthesis of all ribonucleic monomers all the more fascinating
Life could be a low-entropy but conserved pocket arising in a generally disordered milieu that reinforces more entropy for the overall system. The living organism has to work hard to stay ordered against all forces. In doing so, it extracts and processes energy from its surroundings in a way that makes the surroundings increase in entropy faster than before.
This might be relevant. I have only just come across a concept called negentropy. This is defined as the entropy that a living system exports to keep its own entropy low. I am not fully understanding the concept, but it appears that all living systems have negentropy.
Finally, someone who explains what entropy is in a way I can understand. Thank you.
Beautiful description!🤍
Best explanation yet
Thanks, this is the only vid that helped me understand entropy better lol
This is how you can stir your tea and not end up back at the beginning of making it.
If it's spontaneous, then it's also thermodynamically favorable
Entropy can also be treated as the amount of missing information about a system. The more information you have about the state of every particle in a closed system, the lower its entropy. Maxwell's demon is a good way to see entropy under this definition. It is widely used in signal processing, like when sending signals with an antenna.
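That information view is Shannon entropy; a minimal sketch (my example, not from the comment) showing that a fully known state carries zero entropy while uncertainty over many states carries more:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the average information you are
    missing about which state the system is actually in."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Fully known state: nothing is missing, zero entropy.
print(shannon_entropy([1.0]))
# Four equally likely states: 2 bits of missing information.
print(shannon_entropy([0.25] * 4))  # 2.0
```

Learning the exact microstate (as Maxwell's demon does) drives this quantity to zero, which is why the demon seems to cheat the second law until you count the cost of storing and erasing that information.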
Question: Is saying something having Entropy the same as saying something has potential? Like a glob of clay: Potential to become another shape, yet it's still clay? Thank you!
Spontaneous processes tends to move towards higher entropy. But stars are spontaneous, yet the accumulation of all that hydrogen into one spot is an example of decreasing entropy. So mostly, the universe has tended towards decreasing entropy so far. The diffuse hydrogen clouds are turning into little compact dots of hot stars - all the hot particles moving into one corner of the room?
The best physics teacher ❤️
"I am the Entropy Zero"
-Bad Cop from Entropy Zero
Boom...means everything decays, and eventually ends.
Everything tends toward entropy. Like metal rusting. In the end nothing will be left but empty space.
Inexact: it is not possible to modify a system and claim to reproduce the same configuration it had before. Entropy does not reflect a system's loss of information. Entropy is a manifestation of the irreversibility of physical phenomena; including information in entropy is a figure of speech. The information contained in even the simplest expression of physical phenomena is immeasurable. In that sense, information about a state is a metaphor that makes an extremely complex state imaginable. If information were something fundamental to entropy, we would have entropy rates for certain phenomena. But we have no rates or constants inherent to entropy.
I just see entropy as distribution. Higher entropy = more distributed over space.
Thank you
Black holes are the lowest entropy.
=== Energy Always Flows Down Hill ===
Why is the universe not random? Or is it?
Which entropy is greater, hot gas or cold gas? Since entropy always increases.
Think of energy levels in an atom.
By giving away Energy, you can reach all the Energy levels below you.
More energy means more reachable levels.
If the gas is hot, the average atom has more energy and thus can reach more energy levels by giving some away to neighbors.
Therefore there are more possible configurations and thus more entropy.
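That energy-level argument can be made concrete with a toy "Einstein solid" count (my own sketch, not from the thread): distribute q indivisible energy quanta among N atoms and count the distinct configurations; more total energy means more microstates.

```python
from math import comb

def multiplicity(n_atoms, n_quanta):
    # Ways to distribute n_quanta identical energy quanta among
    # n_atoms distinguishable atoms (the "stars and bars" count).
    return comb(n_quanta + n_atoms - 1, n_quanta)

N = 5
for q in (1, 5, 20):  # "colder" -> "hotter"
    print(f"{q:>2} quanta: {multiplicity(N, q)} microstates")
# More energy -> more reachable configurations -> higher entropy.
```

For 5 atoms the count grows from 5 microstates at 1 quantum to 10,626 at 20 quanta, which is the hot-gas-has-more-entropy intuition in miniature.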
For a second I thought they meant the song
Indistinguishable
Nahh i was just searching the song entropy......mhmh yt did u just wanted me to work.🤦
What if I break the crystalline structures up into shards and shake the box in a zero-g environment? Isn't that kind of the same thing while maintaining a solid state? The gas would still have a higher entropy I suppose but their behavior would be similar
That's a problem, because there's an intervention by an outside force. E.g., what if an outside force rearranges the object back into its original state?
Then this universe is not operating in a spontaneous way?
Meaning - It is Operated by Someone or Some Power Who is capable of doing that!!!
Thanks for giving more insight into Creation!
Fa Thabarkallhu Ahsanul haleqeen 🎉
The problem with this is that he does not define DISORDER, which is a PROBABILITY: the number of microstates in a macrostate divided by the total number of microstates in the system, always between 0 and 1. What is confusing is equating entropy with the total number of microstates in the system. That treats the macrostate as just a descriptor of the whole system (like P, V and T), which it is not, because P, V and T only apply to gases, while P and T only apply to liquids and T only applies to solids, meaning it is not universal.
wtf, how is this related to the integral of dU/T?
Presumably infinite entropy is the ground state with the lowest energy, and thus the most stable, like a primitive society with nothing organized and thus never going broke.
Entropy is the 😈 devil. I've been fighting him my entire life, but at least I know he is the orchestrator of his own demise.
So, in sci-fi and fantasy stories, the evil baddies often say they want to bring chaos by destroying the universe. Whereas in fact they would be bringing order.
Disorder*
I thought entropy was about decay?
You thought wrong -Shrek
You are not wrong in your thinking.
Decay is an example of the direction of entropy in the universe. If you leave a dead body on the ground for a sufficient amount of time, it loses its delicate organization and becomes a disordered structure with higher entropy.
Why does one person say I'm wrong and another say I'm not wrong?
@@Kyle-nm1kh playing with ya
in thermodynamics it’s the energy lost to heat u guyz #2Spirit
yeah heat is the lowest form of energy once it’s there… yup
we actually do have a Master degree in chemical engineering
Mess
Entropy and the arrow of time is hard to understand, especially the Entropy of information. The 2nd Law is profound and disturbing.
Entropy generally defines ever-increasing and random disorder. It has nothing to do with the RATE of disorder.
Scientists can’t explain it because it’s not supposed to make sense
All of creation is running down to a least energy state. Just look at the tRumpublican party as an example.
Typical lib 🤡
Hey, remember what you thought about the obsessed folks who could turn a discussion on any topic into a rant against Obama?
Congratulations, that's you now.
Nope.
Entropy is the breaking down of order.