What Is Entropy, Really?

  • Published Oct 4, 2024
  • Full video here: • How Did Life Arise fro...
    Entropy is usually defined as "disorder," but that is not quite the right way to think about it. A better and more precise way is to think of it as the number of ways the microscopic components of a system can be arranged without changing its macroscopic properties. High-entropy systems can be arranged in more ways than low-entropy systems. Often this is indistinguishable from disorder, which is why "disorder" is the common simplified definition.
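
To make the counting definition concrete, here is a minimal sketch (not from the video) of Boltzmann's formula S = k_B·ln W for a toy system: N labelled particles that can each sit in the left or right half of a box. The macrostate is just "how many particles are on the left"; its multiplicity W is the number of microstates that realise it, and the even split wins by an enormous margin.

```python
# Toy microstate counter: N distinguishable particles, each in the left or
# right half of a box. A macrostate is "n_left particles on the left";
# its multiplicity is W = C(N, n_left), and Boltzmann's S = k_B * ln(W).
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_of_macrostate(n_total: int, n_left: int) -> float:
    """Boltzmann entropy S = k_B ln W for the macrostate 'n_left on the left'."""
    multiplicity = math.comb(n_total, n_left)   # number of microstates W
    return K_B * math.log(multiplicity)

N = 100
for n_left in (0, 10, 50):
    w = math.comb(N, n_left)
    print(f"n_left={n_left:3d}  W={w:.3e}  S={entropy_of_macrostate(N, n_left):.3e} J/K")
# The 50/50 macrostate has ~1e29 microstates; "everything on one side" has exactly 1.
```

At everyday particle numbers (~10²³) the imbalance is so extreme that the spread-out macrostates are, for all practical purposes, the only ones ever observed; that is the statistical content of the second law.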

COMMENTS • 85

  • @ArvinAsh
    @ArvinAsh  1 year ago +7

    Full Video Here: ua-cam.com/video/CkAPhZ2QMg4/v-deo.html

    • @dvans5435
      @dvans5435 1 year ago +4

      If entropy is always increasing, does that mean there was zero entropy at the Big Bang? Could the singularity not be disordered? If Big Bang singularities and black hole singularities are similar, does this imply that the entropy at the centre of a black hole is zero? And what about before the Big Bang? Was there negative entropy?

    • @ArvinAsh
      @ArvinAsh  1 year ago +2

      @@dvans5435 Great question. There is really nothing like zero entropy, except perhaps at absolute zero. But this would not have been the case at the Big Bang. Why entropy was low at the Big Bang is not really known. Alan Guth has suggested that maximum entropy may be infinite, so no matter where you start, there is always some higher level of entropy. Entropy inside a black hole is actually quite high (see the note at the end of this thread). At the singularity, we don't know what exists, or whether a singularity even really exists. You can have decreasing entropy, but overall entropy can never be negative.

    • @dvans5435
      @dvans5435 1 year ago +1

      @@ArvinAsh tnx

    • @Hogballs
      @Hogballs 1 year ago

      @@dvans5435
      Yes.
      Our universe is decreasing in entropy and will continue to decrease until it reaches entropy zero.
      Crystallized, motionless.
      Then it will explode, reversing course to increasing entropy, another big bang.
      Eventually it will again become more ordered, moving to lower entropy.
      This cycle is eternal.
      Expansion and contraction, peaks and troughs of a giant wave.
      Breathe in, breathe out.
      We are the cosmos, and the cosmos is us.

    • @86JEYPI
      @86JEYPI 5 months ago +1

      "Measure of disorder" is a simplification used in lectures... Strictly and formally, entropy is not a measure of disorder but, as you said, a quantification of configurations.

  • @kernicterus1233
    @kernicterus1233 1 year ago +60

    Or as the 2nd law of teenagedynamics states: in any closed bedroom, entropy always increases with time.

    • @Vunderbread
      @Vunderbread 6 months ago +3

      Until consciousness is factored in. It is an innate characteristic of the fabric of the universe, representing a counter-acting force, constantly 'luring' the physical world towards ever-increasing order. That's how sentient life exists.

  • @Uhhhnvm
    @Uhhhnvm 5 months ago +18

    I was not expecting that end sound effect 😭

  • @box-botkids3267
    @box-botkids3267 5 months ago +2

    Thank you. I’ve searched many videos, and this is the first on entropy I found that I can wrap my head around.

  • @dfsnsdfn
    @dfsnsdfn 1 year ago +28

    "Order" and "disorder" are completely arbitrary terms which don't mean anything outside of human perception; gas particles in a room in a state of equilibrium still look pretty ordered to me, at least. I always find it more helpful to think about how spread out the energy in a system is, especially when talking about entropy in terms of the age of the universe. Matter and energy will tend to spread out evenly over time; that's heat death (see the sketch at the end of this thread).

    • @MIN0RITY-REP0RT
      @MIN0RITY-REP0RT 1 year ago +3

      Disorder is certainly definable by humans. It is statistically random dispersion which cannot be defined in an orderly way. As the Universe as we know it continues to expand, so will random dispersion continue, no matter how smooth humans perceive it to be.

    • @dfsnsdfn
      @dfsnsdfn 1 year ago

      @@MIN0RITY-REP0RT My issue, and the issue for most laypeople, is that while a randomly dispersed system is statistically disordered, it can still seem pretty ordered to a common observer. I just find that when explaining entropy it's better to think about order more in terms of dispersal. You're obviously correct, though. I wasn't proposing changing our model of entropy at all, just considering a more intuitive way to explain it. I personally never really started to understand entropy until it was explained to me as it applies to ecology during one of my undergrad papers. Honestly, I've never been a fan of nomenclature in physics; terms are often used in ways very different from their typical English meanings. (I know all scientific nomenclature is arbitrary; language itself is arbitrary.) But obviously, if I ever write about entropy academically, I still use terms like disorder to describe it. Dispersal is just a better way to teach it, in my opinion :)

    • @davidtaylor-cc7ig
      @davidtaylor-cc7ig 1 year ago

      Order and disorder are not totally arbitrary words. Whether that has meaning outside of human perception is irrelevant and meaningless. A state of equilibrium of gas particles could only be attained at absolute zero, in which case it probably would look the same to you; we would have to acknowledge your subjective opinion on the "would pretty much look the same" point. Otherwise, a very thoughtful comment.

    • @cortster12
      @cortster12 1 year ago

      We literally have them defined, that’s the point of words.

    • @dfsnsdfn
      @dfsnsdfn 1 year ago

      @@davidtaylor-cc7ig Yeah, honestly, I probably should have clarified that my point is more that I personally find the terms order and disorder unhelpful when teaching someone entropy for the first time. I always found it easier to understand when it was explained in terms of dispersal. I think students sometimes struggle because particles in equilibrium can still look quite "ordered" subjectively. I suppose the issue is that all we have is human perception, so what is outside of it is irrelevant. Thanks for your thoughtful and challenging response.
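
To illustrate the "energy spreads out over time" picture from the comment that opened this thread, here is a rough sketch (not from the video; the walker count, bin count, and step count are arbitrary choices). Random walkers start bunched in one bin, and the coarse-grained entropy of the bin occupancies climbs toward its maximum as they spread out.

```python
# Spreading-out toy model: 1000 random walkers start in the leftmost of 20
# bins; the coarse-grained entropy S = -sum(p_i * ln p_i) of the bin
# occupancies rises toward its maximum, ln(20), as the walkers disperse.
import math
import random

N_WALKERS, N_BINS, STEPS = 1000, 20, 2000
positions = [0] * N_WALKERS          # everyone starts in bin 0 (low entropy)

def coarse_entropy(positions):
    counts = [0] * N_BINS
    for x in positions:
        counts[x] += 1
    probs = [c / len(positions) for c in counts if c > 0]
    return -sum(p * math.log(p) for p in probs)

for step in range(1, STEPS + 1):
    for i in range(N_WALKERS):
        # take one random step left or right, staying inside the box
        positions[i] = min(N_BINS - 1, max(0, positions[i] + random.choice((-1, 1))))
    if step % 500 == 0:
        print(f"step {step:4d}: S = {coarse_entropy(positions):.3f} (max {math.log(N_BINS):.3f})")
```

Once the occupancies are essentially uniform the entropy stops growing; that flat, featureless end state is the toy version of heat death.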

  • @ghsbadgerfgb8953
    @ghsbadgerfgb8953 3 months ago +1

    This is the only video that helped me understand, thank you

  • @wafikiri_
    @wafikiri_ 1 year ago +18

    A neuron can have as many as 200,000 afferent (input) synapses, and at most one axon as its output channel to myriads of other neurons via its efferent (output) synapses, which are afferent to the latter. That's the case for Purkinje cells in the cerebellum.
    Each afferent synapse can carry a nervous signal that may have nothing to do with the signals arriving at the other afferent synapses. Therefore, there may be up to about 2²⁰⁰ ⁰⁰⁰ (roughly 10⁶⁰ ²⁰⁶) different neuron-stimulating patterns; given that the number of particles in the universe is of the order of 10⁸⁴, and that the output axon signal has only two states (an action potential, or spike, or its absence), we can say a neuron does an excellent job of reducing information entropy to a minimum. (A quick check of that arithmetic appears at the end of this thread.)

    • @zedryelmeldrygor1070
      @zedryelmeldrygor1070 1 year ago +2

      It's a negentropic process.

    • @dfsnsdfn
      @dfsnsdfn 1 year ago +2

      Life processes are really good at reducing local entropy or keeping it low (by increasing the entropy of neighbouring systems, of course). Entropy is a really fascinating concept to apply to ecology. David M. Wilkinson brings it into his Earth Systems model of ecology (which I have some issues with, but I really like how it explains entropy and the idea that life is a "low entropy" island).

    • @Kj16V
      @Kj16V 1 year ago +1

      @@dfsnsdfn That's interesting. I always thought life would be high entropy.

    • @dfsnsdfn
      @dfsnsdfn 1 year ago +5

      @@Kj16V No, quite the opposite, especially because a lot of the chemical processes involved in life are extremely complex and therefore unlikely to happen spontaneously. Personally, I couldn't see life arising in a very high entropy system. In fact, you could say entropy increases when an organism dies and decays, but other organisms are what cause that decay, lowering their own local entropy by increasing entropy elsewhere. I mainly do work in insect ecology (academically) and environmental risk analysis (privately), so I have no background in physics, but I enjoy the topic and I think cross-disciplinary thinking is more important than ever these days.

    • @leonmitas
      @leonmitas 1 year ago

      @@dfsnsdfn Can you please explain what cross-disciplinary means?
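
A quick back-of-the-envelope check of the combinatorics in the comment that opened this thread (the 200,000-synapse figure is the commenter's; treating each synapse as simply on or off is an assumption of the sketch):

```python
# How many distinct on/off input patterns can 200,000 binary synapses present,
# expressed as a power of ten?
import math

n_synapses = 200_000
exponent = n_synapses * math.log10(2)             # log10(2 ** n_synapses)
print(f"2**{n_synapses} ~= 10**{exponent:.0f}")   # -> 10**60206

# Compare with a commonly quoted ~1e80 atoms in the observable universe:
print(exponent / 80)                              # the exponent alone is ~750x larger
```

Even funnelled into a single binary output, that is an astronomically large input space, which is the commenter's point about the neuron discarding almost all of its input information.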

  • @aryadhankash
    @aryadhankash 1 year ago +10

    Nicely explained

  • @extremeheat7947
    @extremeheat7947 1 year ago +8

    I always found these analogies/definitions of entropy too subjective. What feels like "order" to you is a mental preference, not clearly an objective property of the universe. So the simplest explanation I can think of is: more spaced apart = more possible states = higher entropy. Less spaced apart = fewer possible states = lower entropy.

  • @sauce4335
    @sauce4335 1 year ago

    I think this is what makes the abiotic synthesis of all ribonucleic monomers all the more fascinating

  • @johnterry6541
    @johnterry6541 1 year ago +2

    Life could be a low-entropy but conserved pocket arising in a generally disordered milieu to reinforce more entropy for the overall system. A living organism has to work hard to stay ordered against all forces. In doing so, it extracts and processes energy from its surroundings in a way that makes the surroundings increase in entropy faster than before.

    • @noelwass4738
      @noelwass4738 1 year ago +1

      This might be relevant. I have only just come across a concept called negentropy, defined as the entropy that a living system exports to keep its own entropy low. I don't fully understand the concept, but it appears that all living systems have negentropy.

  • @hopaideia
    @hopaideia 4 months ago

    Finally, someone who explains what entropy is in a way I can understand. Thank you.

  • @candicemitchell6093
    @candicemitchell6093 4 months ago

    Beautiful description!🤍

  • @PVL14
    @PVL14 1 year ago

    Best explanation yet

  • @lotobloom9768
    @lotobloom9768 7 months ago

    Thanks, this is the only vid that helped me understand entropy better lol

  • @TurinTuramber
    @TurinTuramber 1 year ago +1

    This is how you can stir your tea and not end up back at the beginning of making it.

  • @TheRevelation_Official
    @TheRevelation_Official 5 months ago +1

    If it's spontaneous, then it's also thermodynamically favorable

  • @PenguinPotato97
    @PenguinPotato97 2 months ago

    Entropy can also be treated as the amount of missing information about a system. The more information you have about the state of every particle in a closed system, the lower its entropy. Maxwell's demon is a good way to see entropy under this definition. This view is widely used in signal processing, for example when sending signals with an antenna.
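
A minimal sketch of that information view (not from the video): Shannon entropy, H = -Σ p·log₂(p), is the average number of bits you are still missing about which state the system is actually in. A state you know exactly has zero entropy; a uniformly uncertain one has the maximum.

```python
# Shannon entropy H = -sum(p * log2(p)): the average number of bits of
# information you are missing about which state the system is in.
import math

def shannon_entropy(probs):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([1.0]))              # state known exactly      -> 0.0 bits
print(shannon_entropy([0.5, 0.5]))         # fair coin                -> 1.0 bit
print(shannon_entropy([1/8] * 8))          # 8 equally likely states  -> 3.0 bits
print(shannon_entropy([0.9, 0.05, 0.05]))  # mostly known             -> ~0.57 bits
```

Multiplying bits by k_B·ln 2 converts this into thermodynamic units, which is the bridge that Maxwell's-demon arguments (and Landauer's principle) exploit.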

  • @audioartisan
    @audioartisan 28 days ago

    Question: Is saying something having Entropy the same as saying something has potential? Like a glob of clay: Potential to become another shape, yet it's still clay? Thank you!

  • @effectingcause5484
    @effectingcause5484 6 months ago

    Spontaneous processes tend to move towards higher entropy. But star formation is spontaneous, yet the accumulation of all that hydrogen into one spot looks like an example of decreasing entropy. So mostly, the universe has tended towards decreasing entropy so far. The diffuse hydrogen clouds are turning into little compact dots of hot stars - all the hot particles moving into one corner of the room?

  • @science_spectra_
    @science_spectra_ 1 year ago

    The best physics teacher ❤️

  • @anaxolotl6637
    @anaxolotl6637 5 months ago

    "I am the Entropy Zero"
    -Bad Cop from Entropy Zero

  • @ahoksbergen
    @ahoksbergen 1 year ago

    Boom...means everything decays, and eventually ends.

  • @woodrowtaylor6907
    @woodrowtaylor6907 1 year ago

    Everything tends toward entropy. Like metal rusting. In the end nothing will be left but empty space.

  • @lopezarellanojose1460
    @lopezarellanojose1460 1 year ago

    Inexact: it is not possible to modify a system and claim to reproduce the same configuration it had before. Entropy does not reflect a system's loss of information. Entropy is a manifestation of the irreversibility of physical phenomena; including information in entropy is a figure of speech. The information contained in even the simplest expression of physical phenomena is immeasurable. In that sense, information about a state is a metaphor that makes an extremely complex state imaginable. If information were something fundamental to entropy, we would have entropy rates for certain phenomena. But we have no rates or constants inherent to entropy.

  • @nanashipersonne4151
    @nanashipersonne4151 4 months ago

    I just see entropy as distribution. Higher entropy = more distributed over space.

  • @augurcybernaut4785
    @augurcybernaut4785 1 year ago

    Thank you

  • @zukodude487987
    @zukodude487987 1 year ago +1

    Black holes are the lowest entropy.

  • @kennethbransford820
    @kennethbransford820 1 year ago

    === Energy Always Flows Down Hill ===

  • @nsc2443
    @nsc2443 1 year ago

    Why is the universe not random? Or is it?

  • @jakubkusmierczak695
    @jakubkusmierczak695 1 year ago

    Which has greater entropy, hot gas or cold gas? Since entropy always increases.

    • @carl6167
      @carl6167 5 months ago

      Think of energy levels in an atom.
      By giving away energy, you can reach all the energy levels below you.
      More energy means more reachable levels.
      If the gas is hot, the average atom has more energy and thus can reach more energy levels by giving some away to neighbors.
      Therefore there are more possible configurations, and thus more entropy.
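
A toy way to make that counting explicit, under the simplifying assumption of an Einstein solid (N oscillators sharing q indistinguishable energy quanta; none of this is from the video):

```python
# Einstein-solid toy model: N oscillators share q energy quanta.
# The number of microstates is Omega = C(q + N - 1, q), and S = k_B * ln(Omega).
# More energy quanta (a hotter solid) -> more microstates -> more entropy.
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy(n_oscillators: int, n_quanta: int) -> float:
    omega = math.comb(n_quanta + n_oscillators - 1, n_quanta)
    return K_B * math.log(omega)

N = 100
for q in (10, 100, 1000):   # "colder" to "hotter" at fixed size
    print(f"q={q:5d}  S={entropy(N, q):.3e} J/K")
```

The entropy rises monotonically with the energy content, which matches the answer above: the hot gas has the greater entropy.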

  • @TalxGachaEdits
    @TalxGachaEdits 7 months ago

    For a second I thought they meant the song

  • @MrofficialC
    @MrofficialC 1 year ago

    Indistinguishable

  • @Kpopfanki
    @Kpopfanki 4 months ago

    Nahh i was just searching the song entropy......mhmh yt did u just wanted me to work.🤦

  • @phoenixcreations2201
    @phoenixcreations2201 1 year ago

    What if I break the crystalline structures up into shards and shake the box in a zero-g environment? Isn't that kind of the same thing while maintaining a solid state? The gas would still have a higher entropy, I suppose, but their behavior would be similar.

    • @SilhSe
      @SilhSe 1 year ago

      That's a problem because there's an intervention by an outside force. E.g., what if an outside force rearranged the object back into its original state?

  • @Tamilanban.T
    @Tamilanban.T 1 year ago +1

    Then this universe is not operating in a spontaneous way?
    Meaning - It is Operated by Someone or Some Power Who is capable of doing that!!!
    Thanks for giving more insight into Creation!
    Fa Thabarkallhu Ahsanul haleqeen 🎉

  • @mikebellamy
    @mikebellamy 1 year ago

    The problem with this is that he does not define DISORDER, which is a PROBABILITY equal to the number of microstates in a macrostate divided by the total number of microstates in the system, and is always between 0 and 1. What is confusing is to equate entropy with the total number of microstates in the system. That treats the macrostate as just a descriptor of the whole system (like P, V and T), which it is not, because P, V and T only apply to gases, while P and T only apply to liquids and T only applies to solids, meaning it is not universal.
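
For a concrete version of that microstate/macrostate bookkeeping, here is a toy sketch (not tied to the video): four fair coins, where each exact heads/tails sequence is a microstate and "k heads" is a macrostate whose probability is its share of the 16 microstates.

```python
# Macrostate probability = (microstates in the macrostate) / (total microstates).
# Toy system: 4 fair coins. Microstate = exact H/T sequence (16 in total);
# macrostate = number of heads.
from collections import Counter
from itertools import product

microstates = list(product("HT", repeat=4))            # all 16 sequences
macro_counts = Counter(seq.count("H") for seq in microstates)

for heads, count in sorted(macro_counts.items()):
    print(f"{heads} heads: {count:2d} microstates, probability {count / len(microstates):.4f}")
# "2 heads" owns 6 of the 16 microstates: the most probable (highest-multiplicity) macrostate.
```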

  • @atreidesson
    @atreidesson 5 months ago

    wtf, how is this related to the integral of dU/T?
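
A brief note on that connection (standard textbook material, not from this clip): the thermodynamic definition is Clausius's, with reversible heat rather than internal energy in the numerator, and Boltzmann's statistical definition counts microstates. For large systems the two agree:

$$ \Delta S = \int \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad S = k_B \ln W. $$

For example, an ideal gas expanding isothermally and reversibly from V₁ to V₂ absorbs Q_rev = N·k_B·T·ln(V₂/V₁), so the Clausius integral gives ΔS = N·k_B·ln(V₂/V₁); counting position microstates (W ∝ V^N) gives exactly the same answer.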

  • @zack_120
    @zack_120 1 year ago

    Presumably infinite entropy is the ground state with the lowest energy, and thus the most stable, like a primitive society with nothing organized and thus never going broke.

  • @johnmneu
    @johnmneu 6 months ago

    Entropy is the 😈 devil. I've been fighting him my entire life, but at least I know he is the orchestrator of his own demise.

  • @Kj16V
    @Kj16V 1 year ago

    So, in sci-fi and fantasy stories, the evil baddies often say they want to bring chaos by destroying the universe, whereas in fact they would be bringing order.

  • @Kyle-nm1kh
    @Kyle-nm1kh 1 year ago

    I thought entropy was about decay?

    • @Sehon13Ultd
      @Sehon13Ultd 1 year ago +1

      You thought wrong -Shrek

    • @PK39719
      @PK39719 1 year ago

      You are not wrong in your thinking.

    • @dvans5435
      @dvans5435 1 year ago +1

      Decaying is an example of the direction of entropy in the universe. If you leave a dead body on the ground for a sufficient amount of time, it will lose its delicate structure and become a disordered structure with higher entropy.

    • @Kyle-nm1kh
      @Kyle-nm1kh 1 year ago

      Why does one person say I'm wrong and another say I'm not wrong?

    • @dvans5435
      @dvans5435 1 year ago

      @@Kyle-nm1kh playing with ya

  • @KhaoticDeterminism
    @KhaoticDeterminism 1 year ago

    in thermodynamics it's the energy lost to heat u guyz #2Spirit
    yeah heat is the lowest form of energy once it's there… yup
    we actually do have a master's degree in chemical engineering

  • @GABBYAYYY
    @GABBYAYYY 4 months ago

    Mess

  • @dougieh9676
    @dougieh9676 5 months ago

    Entropy and the arrow of time are hard to understand, especially the entropy of information. The 2nd law is profound and disturbing.

  • @MIN0RITY-REP0RT
    @MIN0RITY-REP0RT 1 year ago +1

    Entropy generally defines ever-increasing and random disorder. It has nothing to do with the RATE of disorder.

  • @chrisglover2697
    @chrisglover2697 1 year ago

    Scientists can’t explain it because it’s not supposed to make sense

  • @ancientheart2532
    @ancientheart2532 1 year ago

    All of creation is running down to a least energy state. Just look at the tRumpublican party as an example.

    • @extremeheat7947
      @extremeheat7947 1 year ago

      Typical lib 🤡

    • @ddichny
      @ddichny 1 year ago

      Hey, remember what you thought about the obsessed folks who could turn a discussion on any topic into a rant against Obama?
      Congratulations, that's you now.

  • @ShaneOsborne
    @ShaneOsborne 1 year ago

    Nope.
    Entropy is the breaking down of order.