The Startling Reason Entropy & Time Only Go One Way!

  • Published 27 Apr 2024
  • Learn more about probability and statistics at brilliant.org/ArvinAsh Get started for free, and hurry: the first 200 people get 20% off an annual premium subscription.
    PATREON:
    For Early Access, Input on New Videos, Personal answers to Questions, Join My Patreon:
    / arvinash
    REFERENCES:
    Arvin Ash video on Entropy, Time & information: • The Stunning link betw...
    Second Law of Thermodynamics & QM: tinyurl.com/2asekdfe
    Entropy and quantum information: tinyurl.com/2p4cdgtr
    CHAPTERS:
    0:00 Why do things tend towards their lowest energy?
    1:27 What is the Second Law of Thermodynamics?
    4:35 Why do things tend to go to their lowest energy state?
    7:03 How probability enters into the picture
    8:41 What is entropy REALLY and why does it only increase
    9:58 What increasing entropy implies for the Universe
    10:51 How entropy might be related to flow of time
    11:48 Learn more about statistics and probability at Brilliant
    13:16 Join our Patreon: / arvinash
    SUMMARY:
    What drives natural phenomena? Why does entropy only increase or stay the same? Why does a pencil on its tip fall to the table?
    Things always tend towards their lowest energy state. Why is nature driven this way?
    A simple way to think of entropy is disorder. But what’s driving this disorder? What is the underlying cause of higher entropy?
    Whenever something happens, some kind of change has to occur. Properties like speed, mass or temperature change. Physics defines the rules that allow us to predict these changes. But if something can change from, say, configuration A to configuration B, then why shouldn’t the reverse also be allowed, from B to A?
    When a ball rolls down a hill from a certain height, it should be able to roll back up to the same height. If energy is conserved, then all its potential energy at the top of the hill is converted to kinetic energy at the bottom. And that kinetic energy should convert completely back to potential energy, returning the ball to the height it originally started from. But this is not what happens; the process is irreversible. We can scramble an egg, but we can’t unscramble it.
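    For the idealized, frictionless ball, this is just conservation of energy, which reads the same forwards and backwards (a standard textbook relation, added here for concreteness; m is the ball's mass, g the gravitational acceleration, h the hill height, v the speed at the bottom):

        mgh = \tfrac{1}{2}mv^{2} \quad\Longrightarrow\quad v = \sqrt{2gh}

    Nothing in this equation picks out a preferred direction; run in reverse, a ball leaving the bottom at speed v climbs right back to height h.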
    Conservation laws can’t answer the question why there is a preferred direction to any particular change. Things move in a particular direction as time moves forward. The reason is not as mysterious as it might sound. It all has to do with probability.
    If we hold a pencil on its tip and let it go, it will fall and lie horizontally. Why did this happen? When it was held up, it had high potential energy. As it fell, that potential energy became kinetic energy, and when it hit the table, that kinetic energy was transferred to the table and air in the form of heat and sound. Energy is conserved. But why did this transfer of energy happen? Why doesn’t the reverse happen? Why doesn’t the energy of the air and heat configure itself in such a way that it stands the pencil back up? Energy would be conserved in that case too.
    What’s really happening is that although energy is conserved, the type of energy has changed. Useful energy has converted to less useful energy. When a system is at a high energy state, it has more energy available to do work. When the pencil was standing on its tip, it had more potential energy. It had more ability to do work. After it hit the table, it had less energy available to do work.
    The energy that transferred to the table and air in the form of heat and sound is less useful. The same thing can be applied to a boulder falling from the top of a mountain to the bottom.
    The real question is why some systems tend to transfer their energy to other systems. This is where probability comes in. When the pencil is standing up, there is just a single way to arrange the potential energy among the atoms that comprise the pencil. Once the pencil has fallen and transferred its kinetic energy to the movement of atoms in the table and air, in the form of heat and sound, there is an innumerably large number of ways that energy could be divided up among all the atoms in the surrounding air and table.
    So while there are just a few ways that energy can go towards the motion of the pencil, there is a very large number of ways in which that energy could be distributed among the motions and vibrations of atoms in the surroundings. If we assume that any possible distribution of energy is equally likely, then the case with a mind-bogglingly large number of possibilities is overwhelmingly more likely to occur. In fact, it will occur virtually all the time, and the case with just a few possibilities will essentially never occur.
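    A minimal numerical sketch of this counting argument (an editorial illustration, not from the video): treat the released energy as q indistinguishable quanta shared among N modes, so the number of arrangements is the stars-and-bars count C(q+N-1, q). The values q = 50 and N = 100 are arbitrary.

        from math import comb

        def multiplicity(quanta, modes):
            # Number of ways to distribute `quanta` indistinguishable energy units
            # among `modes` distinguishable modes (stars-and-bars counting).
            return comb(quanta + modes - 1, quanta)

        q = 50                         # illustrative number of energy quanta
        print(multiplicity(q, 1))      # all energy held by one mode: exactly 1 way
        print(multiplicity(q, 100))    # energy shared among 100 modes of table and air: roughly 10**40 ways

    With realistic numbers of atoms the ratio becomes so lopsided that the concentrated arrangement is never seen again once the energy has spread out.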
    #entropy
    #timeandentropy
    Increasing entropy might actually be giving a meaningful definition to the flow of time in our universe. One way to think of time is as things changing. And since things are always changing in the higher-entropy direction, we can identify the direction of higher entropy with the forward movement of time. If there were no change, then perhaps there would be no time.
  • Science & Technology

COMMENTS • 1.2K

  • @stoneysdead689
    @stoneysdead689 Рік тому +14

    This is the first explanation of entropy that explains why the universe started out in such a low entropy state- because it was so much smaller than it is now, and so much simpler because the symmetries had yet to be broken- it just makes sense that there would've been far fewer ways that the system (universe) could be arranged. Why haven't I heard this before? I guess they thought it was just apparent but- I had never even thought about this at all- it's so simple but explains so much. Entropy itself makes more sense when you think about the energy as being "less useful" even though it's conserved. I am definitely going to check out more of your videos- bravo man, well done.

  • @krnathan
    @krnathan Рік тому +253

    This man deserves an award, for explaining complex concepts with so much creativity and simplicity!

    • @CosmicNihilist
      @CosmicNihilist Рік тому +6

      I agree wow what a stunning episode btw his lovely voice and tonality help too .

    • @Mr0rris0
      @Mr0rris0 Рік тому

      We send him a bonjovi pasta sauce

    • @hyperduality2838
      @hyperduality2838 Рік тому +2

      Randomness (uncertainty, entropy) is dual to order (certainty, syntropy).
      Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle.
      Signals (patterns, syntropy) are dual to noise (entropy) -- the signal to noise ratio in electronics.
      "Entropy is a measure of randomness" -- Roger Penrose.
      Syntropy is a measure of order.
      Repetition (patterns, order, syntropy) is dual to variation (disorder) -- music.
      Increasing the number of dimensions or states is an entropic process -- co-homology.
      Decreasing the number of dimensions or states is a syntropic process -- homology.
      Increasing is dual to decreasing.
      Homology (syntropy, convergence) is dual to co-homology (entropy, divergence).
      Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics!
      Teleological physics (syntropy) is dual to non-teleological physics (entropy).
      Potential energy is dual to kinetic energy -- gravitational energy is dual.
      Synergy is dual to energy -- energy is dual.
      Energy is duality, duality is energy.
      The conservation of duality (energy) will be known as the 5th law of thermodynamics!
      Energy is dual to mass -- Einstein.
      Dark energy is dual to dark matter -- the universe is dual.
      "Always two there are" -- Yoda.
      Integration (syntropy, summation) is dual to differentiation (entropy, differences) - abstract algebra.
      The 4th law of thermodynamics is hardwired into mathematics & mathematical thinking.

    • @tonib5899
      @tonib5899 Рік тому

      @@hyperduality2838 brilliantly put and citing Roger Penrose really nailed it.Well said.T

    • @naughtyat25
      @naughtyat25 Рік тому

      Totally! Never thought time could be a consequence of entropy

  • @videosbymathew
    @videosbymathew Рік тому +83

    Finally, a great explanation of entropy. I always intuitively had this answer, but no one else ever seems to have described it correctly in videos before. Thank you.

    • @JohnnyAngel8
      @JohnnyAngel8 Рік тому +3

      I agree. I've tried to wrap my brain around other attempts but the demonstrations and explanations in this video finally gave me a better understanding. Just don't ask me to recite it back to you! LOL!

    • @videosbymathew
      @videosbymathew Рік тому +1

      @@JohnnyAngel8 Lol, indeed!

    • @jsEMCsquared
      @jsEMCsquared Рік тому

      It cracks me up that entropy, the theory that things are getting smaller, is actually getting bigger!

    • @elio7610
      @elio7610 Рік тому

      @@jsEMCsquared Since when was entropy about stuff getting smaller?

  • @Razor-pw1xn
    @Razor-pw1xn Рік тому +45

    Hi Arvin, could you make a video about the new discovery about the double slit experiment but instead of taking place in space it takes place in time? It would be very interesting to discuss what consequences it implies. "A team led by Imperial College London physicists has performed the experiment using 'slits' in time rather than space. They achieved this by firing light through a material that changes its properties in femtoseconds (quadrillionths of a second), only allowing light to pass through at specific times in quick succession"

    • @b1gb017
      @b1gb017 11 місяців тому +4

      yh I saw an article on that as well, seemed very interesting, would be good to have Arvin’s take on it! 😎

    • @Rami-ll2bq
      @Rami-ll2bq 8 місяців тому

      ❤🧠❤

  • @lookmath4582
    @lookmath4582 Рік тому +33

    Your videos have that really philosophical, metaphysical twist in them which makes them special and understandable 👍❤

  • @georgerevell5643
    @georgerevell5643 Рік тому +9

    I love it when you go beyond the generally accepted answer and give your well-thought-through opinion. You're so brilliant, Arvin!

    • @ArvinAsh
      @ArvinAsh  Рік тому +4

      I appreciate that, but it's not just my opinion. This is the generally accepted way, for now at least, that we understand why these things happen.

    • @evgenistarikov3386
      @evgenistarikov3386 11 місяців тому

      @@ArvinAsh Absolutely so, guys! My humble comment to this stream ought to shed light upon other plausible explanations as for WHY THINGS HAPPEN AT ALL.

    • @user-ky5dy5hl4d
      @user-ky5dy5hl4d 10 місяців тому

      @@ArvinAsh So, movement can be time. Well, consider a grandfather clock with wheels, pegs and a pendulum. Looking at the working clock we see movements of all these coordinated parts of the impressive clock. Yet we cannot say that we see ''time''. Clocks are not sentient devices. Yes, the clock itself undergoes changes, but not because of ''time''; rather because of entropy, which in this video is well defined. But the clock itself does not define ''time''. And like moving waves on the sea, they are not sentient to ''time'' even though they seem to point to ''flowing time'', which actually does not flow, for we would see the flow of time and not waves. Entropy is a form of disintegration of a system, or a rearrangement of a system which was in a state of entropy anyway before it went through another process of entropy. Thus the ''time'' we are all familiar with cannot be flowing, and entropy does not contain an ''element of time'' that the clock is ''showing'' us. Time must be static, and the only way (at least for me) is that any mathematical equation involving ''time'' must be preceded by the imaginary number i = square root of minus 1.

  • @Afrobelly
    @Afrobelly 9 місяців тому +7

    This was amazing. Until now I'd never been able to make sense of the concept of "increasing entropy in a system" which as a science concept seemed hopelessly vague. Explaining it as a function of end state probabilities made the connection for me. Thank you for that. As a side note, I once dropped a penny onto a table where it bounced a bit until finally coming to rest on edge. I couldn't believe my eyes. In amazement, I left that penny sitting there for maybe 20 minutes. But as you explained--improbable, not impossible.

  • @matkosmat8890
    @matkosmat8890 Рік тому +26

    Time as a statistical phenomenon! Thank you, Arvin, your explanations are always spot-on.

  • @johandam4992
    @johandam4992 Рік тому +9

    Way back when I graduated, my thesis was about exergy (useful enthalpy, for a short explanation). This video captured that concept. This is quite an achievement.

  • @bmaverickoz
    @bmaverickoz 11 місяців тому +9

    I like that you referenced Lee Smolin, who is so invested in probing the status of time as fundamental (well, causation anyway, which is a manifestation of time). I think his work is really valuable in this space, and your popularising of the subject is truly inspired, Arvin :)

  • @gafyndavies
    @gafyndavies Рік тому +5

    No matter how bad a day I'm having, when I hear "That's coming up, right now!" a huge smile erupts on my face 😊

  • @SergeyNeskhodovskiy
    @SergeyNeskhodovskiy 2 дні тому

    I also go from "high ability to do work", to "low ability to do work" exactly when my working day starts - now I know why! Thank you!

  • @macsarcule
    @macsarcule 11 місяців тому +8

    This so clearly answered questions I’ve been puzzling over for years. I have new questions to puzzle over now, and that’s extra awesome! Thank you, Arvin! ✌️🙂

  • @Laser593
    @Laser593 Рік тому +4

    Now I know why things keep changing in the universe. The culprit is entropy. Everything changes because of the transfer of energy from one thing to another. Thanks, Mr. Arvin Ash.

  • @timjohnson979
    @timjohnson979 Рік тому +24

    "Time being a statistical phenomenon" is an interesting thought, but is that reality? If it is, then what does that mean in the context of spacetime?
    Love your videos, Arvin. They are clear and thought provoking.

    • @macysondheim
      @macysondheim 11 місяців тому

      All of this fancy science jargon is total nonsense. At the end of the day none of this bogus can be proven in a lab. It’s just blind faith.

    • @sunny_senpai
      @sunny_senpai 11 місяців тому

      would like to know this as well

  • @starrynightlyrics7559
    @starrynightlyrics7559 Рік тому +2

    I will remember your name ten years from now. You explained that concept about life better than my teachers ever did. THAAANKS A LOT !!!

  • @hisss
    @hisss Рік тому +6

    I may now, for the first time in my life, finally, have a slight bit of a beginning of an understanding of what entropy means. "Disorder" was always so vague, I never got who or what decided what was ordered and what was disordered. I don't quite speak maths, so things like statistics and probability are over my head, but at least the concept is starting to make sense. My teachers could never achieve that, so thank you. There might still be hope for me.

    • @evgenistarikov3386
      @evgenistarikov3386 11 місяців тому

      To continue moving along the way you have kindly chosen please check also my humble comment to this stream

    • @Afrobelly
      @Afrobelly 9 місяців тому

      Same here, but I think you said it more plainly than I could. Glad I stumbled onto this video lecture.

  • @jeffreymartin8448
    @jeffreymartin8448 Рік тому +4

    There is nothing more satisfying than when that light bulb goes on, often seemingly on its own when least expected. You realize: of course! Thank you once again, Arvin!

  • @AndrewUnruh
    @AndrewUnruh 8 місяців тому +3

    This was essentially a very good introductory lesson on statistical thermodynamics. I never really understood some thermodynamics concepts until I took a course on statistical thermodynamics as a grad student.

  • @duukvanleeuwen2293
    @duukvanleeuwen2293 Рік тому +44

    I like the way entropy and time are related to probability; never really thought about it that way 🤔

    • @ROHITKINGC
      @ROHITKINGC 11 місяців тому +2

      Boltzmann understood it 120 years ago, and then committed suicide.

    • @user-nu8in3ey8c
      @user-nu8in3ey8c 11 місяців тому

      @@ROHITKINGC Of course, if it is related to probability, then if you wait long enough all the particles will tunnel back to one spot and the big bang will happen again. Or, even more likely according to Boltzmann, particles would tunnel back to one spot to make a small isolated Boltzmann Brain. So entropy can reverse if one looks at the math; you just have to wait long enough.

    • @JamesBrown-fd1nv
      @JamesBrown-fd1nv 11 місяців тому

      ​@@user-nu8in3ey8c there never was a big bang.

    • @user-nu8in3ey8c
      @user-nu8in3ey8c 11 місяців тому

      @@JamesBrown-fd1nv If there was not a big bang, then what was it? How did we get here?

  • @NoActuallyGo-KCUF-Yourself
    @NoActuallyGo-KCUF-Yourself Рік тому +13

    Great timing on this one! I watched it right before one of my chemistry students asked about thermodynamics. I definitely passed this on as a study aide.

    • @chriskennedy2846
      @chriskennedy2846 Рік тому

      I recommend the book: Introduction to Molecular Thermodynamics by Robert M Hanson and Susan Green. Easy to read and covers a lot of ground.

    • @NoActuallyGo-KCUF-Yourself
      @NoActuallyGo-KCUF-Yourself Рік тому

      @@hyperduality2838
      Will you please go away with this nonsense and stop spamming every comment with your crackpot ideas?

  • @superipermagererata5084
    @superipermagererata5084 Рік тому +25

    I really love your videos and the way you explain these complex things; you make them sound easy! Thanks to you I’ve learned so many things, my interest in science and physics has grown a lot, and I’ve discovered my love for this subject :>

  • @juannarvaez5476
    @juannarvaez5476 9 місяців тому +1

    Reading one of Asimov's short stories about the question on entropy.
    I always loved the idea that after several cosmological decades, after protons decayed, after the universe has grown cold and dark, after an incalculable, near-infinite amount of time, an event occurs so improbable as to have a probability of 0.0 with nearly continuous zeros following it before a 1 finally appears. At that point entropy resets to what it was at the beginning of the big bang. "Let there be light" happens again, and another big bang or something similar occurs.

  • @thiagocastrodias2
    @thiagocastrodias2 11 місяців тому +1

    I remember listening once to an explanation where you put a number of marbles inside a box (all grouped according to their colors) and shake it. Almost every time you will get a more homogeneous configuration of marbles. This is due mostly to simple probability, since there is a huge number of possible configurations, but only a few where the marbles are organized according to their colors. Now apply this to the universe and its enormous number of atoms. It's also interesting that things like living organisms and human intention can break this trend and make stuff more organized. Natural selection and human intent and creativity don't operate at the level of pure chance, so naturally improbable states become more likely.
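
    A rough simulation of the marble picture described above (an editorial sketch; the marble counts, the number of trials, and the "one contiguous block per color" definition of organized are arbitrary choices):

        import random

        def organized(row):
            # "Organized" here means each color forms one contiguous block.
            runs = 1
            for a, b in zip(row, row[1:]):
                if a != b:
                    runs += 1
            return runs == len(set(row))

        marbles = ["red"] * 10 + ["blue"] * 10
        trials = 100_000
        hits = sum(organized(random.sample(marbles, len(marbles))) for _ in range(trials))
        print(hits, "organized arrangements out of", trials, "shakes")
        # Only 2 of the C(20, 10) = 184,756 distinguishable orderings are fully organized,
        # so a random shake almost never lands on one.

    Scaling the same count up to the ~10^23 particles of a real system turns "almost never" into "effectively never".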

  • @Horribilus
    @Horribilus Рік тому +73

    Arvin’s qualitative explanations of these mathematically complex phenomena are especially valuable to me, having been frustrated by the Principia, and his perfect articulation makes me say “eureka”.

    • @hyperduality2838
      @hyperduality2838 Рік тому +1

      Randomness (uncertainty, entropy) is dual to order (certainty, syntropy).
      Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle.
      Signals (patterns, syntropy) are dual to noise (entropy) -- the signal to noise ratio in electronics.
      "Entropy is a measure of randomness" -- Roger Penrose.
      Syntropy is a measure of order.
      Repetition (patterns, order, syntropy) is dual to variation (disorder) -- music.
      Increasing the number of dimensions or states is an entropic process -- co-homology.
      Decreasing the number of dimensions or states is a syntropic process -- homology.
      Increasing is dual to decreasing.
      Homology (syntropy, convergence) is dual to co-homology (entropy, divergence).
      Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics!
      Teleological physics (syntropy) is dual to non-teleological physics (entropy).
      Potential energy is dual to kinetic energy -- gravitational energy is dual.
      Synergy is dual to energy -- energy is dual.
      Energy is duality, duality is energy.
      The conservation of duality (energy) will be known as the 5th law of thermodynamics!
      Energy is dual to mass -- Einstein.
      Dark energy is dual to dark matter -- the universe is dual.
      "Always two there are" -- Yoda.
      Integration (syntropy, summation) is dual to differentiation (entropy, differences) - abstract algebra.
      The 4th law of thermodynamics is hardwired into mathematics & mathematical thinking.

    • @sethrenville798
      @sethrenville798 Рік тому

      @@hyperduality2838 Interesting take. I am more of the Stephen Wolfram mind, in that I honestly just think everything in this universe is driven by computation, and the most fundamental building block of our universe is information. These lightning-quick computations drive entropy in an irreversible direction, as the computations collapse the wave functions of the probabilistic futures into actualized specifics, and it actually works directly with the Maxwell's demon explanation of entropy. I also find that his work seems to play out in quite a few interesting ways in other parts of physics, as well as gravity, and has another set of really nifty applications of the main physics equations: Einstein's, the Dirac equation, and the Yang-Mills.

    • @hyperduality2838
      @hyperduality2838 Рік тому

      @@sethrenville798 Spin up is dual to spin down, particles are dual to anti-particles -- The Dirac equation.
      From a converging, convex (lens) or syntropic perspective everything looks divergent, concave or entropic -- the 2nd law of thermodynamics!
      All observers have a syntropic perspective according to the 2nd law of thermodynamics.
      My syntropy is your entropy and your syntropy is my entropy -- duality.
      Mind (the internal soul, syntropy) is dual to matter (the external soul, entropy) -- Descartes or Plato's divided line.
      Convex is dual to concave -- lenses, mirrors.
      Your mind is syntropic as it creates or synthesizes reality (non duality).
      Concepts are dual to percepts -- the mind duality of Immanuel Kant.
      Concepts are syntropic representations built from perceptions or measurements -- category theory.
      The Einstein reality criterion:-
      "If, without in any way disturbing a system, we can predict with certainty (i.e., with probability equal to unity)
      the value of a physical quantity, then there exists an element of reality corresponding to that quantity."
      (Einstein, Podolsky, Rosen 1935, p. 777)
      Internet Encyclopedia of Philosophy:-
      www.iep.utm.edu/epr/
      According to Einstein reality is predicted into existence -- a syntropic process!
      "We predict ourselves into existence" -- Anil Seth, neuroscientist, watch at 56 minutes:-
      ua-cam.com/video/qXcH26M7PQM/v-deo.html
      Making predictions to track targets, goals & objectives is a syntropic process -- teleological.
      Physicists ignore the mind, but all observers have a mind which is syntropic.
      The observed is dual to the observer -- David Bohm.
      Syntropy (Prediction) is dual to increasing entropy -- the 4th law of thermodynamics!
      Main stream physics is currently dominated by teleophobia and eliminative materialism.
      Teleophilia is dual to teleophobia.
      New laws of physics based upon teleology do not go down to well in physics departments -- so you are unlikely to hear about this new law.
      "Philosophy is dead" -- Stephen Hawking.

    • @hyperduality2838
      @hyperduality2838 Рік тому

      @@sethrenville798 All energy is dual -- electro is dual to magnetic -- Maxwell's equations.
      Null vectors or light rays are perpendicular or dual to themselves from our perspective -- self duality.
      The inner product is dual to the cross product -- Maxwell's equations.
      Positive is dual to negative -- electric charge.
      North poles are dual to south poles -- magnetic fields.
      Everything in physics is made out of energy (duality).
      The word "information" means something is being formed or created, synthesized -- a syntropic process!

    • @NoActuallyGo-KCUF-Yourself
      @NoActuallyGo-KCUF-Yourself Рік тому

      ​@@hyperduality2838
      < reported for spam >

  • @CeeJay591
    @CeeJay591 Рік тому +4

    Arvin, I’ve watched many of your videos but this one was really amazing - thanks so much

  • @ToddRickey
    @ToddRickey 10 місяців тому +1

    Brilliance, on a conversational level, with amazing truths!

  • @kanakTheGold
    @kanakTheGold Рік тому +1

    This approach to understanding entropy and time is in agreement with what we observe in the universe and, at the same time, is the simplest explanation I have seen so far, and hence the best way of describing both entropy and time.

  • @pwnedd11
    @pwnedd11 11 місяців тому +3

    Wow, this video made so many things that I am currently looking into clear to me. And it has helped me know what I need to study next. Sorry for being so vague, but again... this is so helpful! Thank you so very much!!!

  • @themcchuck8400
    @themcchuck8400 Рік тому +4

    Completely wonderful video explaining the real meaning of entropy.
    It goes off the rails right at the end (and in the title) when he gets philosophical about time, getting things exactly backwards.

    • @mochiebellina8190
      @mochiebellina8190 Рік тому

      What?

  • @dawid_dahl
    @dawid_dahl Рік тому +2

    Love these Arvin Ash videos so much! 🙏🏻

  • @brenlee9325
    @brenlee9325 11 місяців тому +2

    Thank you so much for making these concepts which always seem so difficult and mysterious, understandable.

  • @aaronaragon7838
    @aaronaragon7838 Рік тому +9

    This man really explains the nut n bolts of physics in a clear fashion.🎉🎊

    • @stevegalaxidas458
      @stevegalaxidas458 11 місяців тому

      Yes. Great explanation, but I now have a nagging suspicion that we drilled down from one fundamental principle, such as time or spacetime, to the principle of probabilities. There should be a video on the concept of physical laws and what we really mean by that.

  • @davidchung1697
    @davidchung1697 Рік тому +3

    There is a simpler explanation of entropy. Consider 3 particles in a box. If you put 2 particles abutting one another at the XYZ coordinate (0, 0, 0) (motionless) and then smash them with the third particle, the 3 particles will begin to bounce around in the box. For the 3 particles to return to their initial state, all 3 would have to retrace their motions, which is highly improbable. This irreversibility is exactly why entropy does not decrease.
    More generally, given a multi-particle closed system, for the particles to return to their prior state all of them would have to move in a particular manner in a concerted fashion, which is an extremely low-probability event.
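
    A back-of-the-envelope companion to this point (an editorial sketch, not the commenter's own calculation): instead of tracking exact trajectories, use the standard estimate that each of N independently wandering particles has about a 1-in-2 chance of being in, say, the left half of the box at any instant, so finding all of them there together has probability (1/2)^N.

        # Chance that all N particles are simultaneously found in one half of the box,
        # a stand-in for "returning to something like the initial configuration".
        for n in (3, 10, 100, 1000):
            print(f"{n} particles: {0.5 ** n:.3e}")

    Already at a few hundred particles this probability is too small to occur even once over the age of the universe, and macroscopic systems contain on the order of 10^23 particles.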

    • @davidchung1697
      @davidchung1697 11 місяців тому

      Just to add a bit more - What the above thought experiment tells you is that the entropy is the consequence of having particles in space-time - it becomes obvious that time is NOT the consequence of the second law of thermodynamics.

  • @alighahramani1252
    @alighahramani1252 Рік тому +1

    OMG.
    This was great, man. Thank you🌺🌺

  • @milanocomprendo7318
    @milanocomprendo7318 8 місяців тому +1

    Thanks for the lovely and simple explanation. I can use this to explain to those I discuss this with.

  • @djfaber
    @djfaber Рік тому +8

    It's interesting that you mention time being a statistical exercise. In software engineering, where you control the universe (to a certain extent), the timekeeping mechanisms of monotonic time and real time differ: in a system which is too busy, it's entirely possible (and highly probable) that wall-clock time can stick or go backwards, and the monotonic timer was invented to address this statistical probability.
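
    A small illustration of the wall-clock vs. monotonic-clock distinction mentioned above (an editorial sketch in Python; the comment does not name a language or API):

        import time

        start_wall = time.time()       # wall-clock time: can jump or step backwards if the system clock is adjusted
        start_mono = time.monotonic()  # monotonic clock: guaranteed never to go backwards

        time.sleep(0.1)                # stand-in for real work

        elapsed_wall = time.time() - start_wall       # could, in principle, come out negative
        elapsed_mono = time.monotonic() - start_mono  # always >= 0, safe for measuring durations

        print(f"wall: {elapsed_wall:.3f}s  monotonic: {elapsed_mono:.3f}s")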

    • @jack.d7873
      @jack.d7873 Рік тому +1

      It's always fascinating to notice similarities in our computerised machinery and the machinery of the Universe. And the way our Universe works; Block time of Quantum Field excitations, is suspiciously similar to the way a super advanced computer system would operate.

  • @LoanwordEggcorn
    @LoanwordEggcorn 11 місяців тому +3

    Not sure if you've already done one (or several), but it would be interesting to hear how entropy relates to quantum mechanics. Quantum mechanics is also about probabilities of different states, for example Feynman diagrams.

    • @user-sm9hh9hz8j
      @user-sm9hh9hz8j 11 місяців тому +1

      And also "Heisenberg Uncertainty Principle" .

  • @DrZedDrZedDrZed
    @DrZedDrZedDrZed Рік тому +2

    I’m really glad Arvin touched on the expansion of the universe in this video; it’s inextricably linked to the concept of entropy in a counterintuitive way that most science communicators don’t talk about. It’s also important to note that if we ONLY consider the contents of the universe at the Big Bang, without considering time, then the universe was extremely homogeneous, disordered, and random; there was no “difference” to be had to put it in a low entropy state, unless you consider the creation of space itself as giving that hot plasma more and more places to go, and doing so extremely quickly. This thinking leads to the best interpretation of entropy I’ve read to date. It’s less about order/disorder, or even the arrow of time. Entropy is the LOOSENING OF CONSTRAINTS. Hat tip to Terrence Deacon for that one. Anyone interested in this topic MUST read Incomplete Nature.

  • @philippefossier7178
    @philippefossier7178 10 місяців тому +2

    Excellent presentation as usual: clear and well explained by someone who is obviously excited about the material. Thank you Arvin. I love your videos about physics and chemistry.

  • @Giulio110199
    @Giulio110199 Рік тому +9

    My chemistry professor, who only used to read his slides, underlining and circling basically everything, left me really dubious about entropy. I was like “yeah ok, the logarithm; the disorder thing makes way more sense in some way”. It’s incredible how right now the logarithm explains things so well and makes such wonderful sense. Had I seen things like this before, maybe I’d have enjoyed chemistry or other entropy-related courses more. Amazing video!!
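
    The logarithm in question is presumably Boltzmann's entropy formula (added here for reference; W is the number of microstates compatible with the macrostate and k_B is Boltzmann's constant):

        S = k_B \ln W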

    • @chrisrace744
      @chrisrace744 Рік тому

      You were in the wrong class. Take physics.

    • @Giulio110199
      @Giulio110199 Рік тому

      @@chrisrace744 I know and I am in physics, it was a mandatory course for physics tho lol

  • @SumitPrasaduniverse
    @SumitPrasaduniverse Рік тому +17

    Very excited to see Arvin's explanation on time & entropy 😃

  • @tinetannies4637
    @tinetannies4637 Рік тому +2

    I adore these videos, thanks so much Arvin

  • @robertbutwell5211
    @robertbutwell5211 3 місяці тому +1

    Well done. Thanks. Love your material.

  • @glaucosaraiva363
    @glaucosaraiva363 Рік тому +6

    The second law states that the entropy of a closed system tends to increase with time, implying a preferential direction for time. This tendency of increasing entropy is associated with the irreversibility of many physical processes, which may partially explain why time seems to flow in a specific direction
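
    For reference, the statement above in symbols (an editorial addition; S is the entropy of an isolated system, and equality holds only for idealized reversible processes):

        \Delta S \geq 0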

    • @drbuckley1
      @drbuckley1 Рік тому

      The entropy of an open system may be slowed but not reversed.

    • @glaucosaraiva363
      @glaucosaraiva363 Рік тому +3

      @@drbuckley1 local order increases at the expense of greater disorder in the surroundings

    • @luudest
      @luudest Рік тому +1

      What has always confused me about the term 'arrow of time': while entropy appears to be very random at the microscopic level, time seems to be very 'precise' and not random. So what makes time so 'precise'?

    • @drbuckley1
      @drbuckley1 Рік тому +1

      @@glaucosaraiva363 That's what I intended to say, if by "local order" you mean an "open system," and by "surroundings" you mean a closed system. A "local order" may "slow" the rate of entropy but it cannot avoid increasing entropy, much less reverse it.

    • @drbuckley1
      @drbuckley1 Рік тому

      @@luudest "Time" is observer dependent. Your worldline is different from mine. "Now" is an illusion because humans cannot perceive the miniscule differences in their respective worldlines. These differences are imperceptible but not immeasurable.

  • @dougieh9676
    @dougieh9676 Рік тому +12

    Entropy is my favorite physics subject. The fact it’s linked to the arrow of time is profound and disturbing.

  • @nerdexproject
    @nerdexproject Рік тому +1

    Loved this video! Makes so much sense!

  • @shreeram5800
    @shreeram5800 Рік тому +2

    Super Concept, and Can't believe that it was simplified in a beautifully elegant way ☺️

  • @s1gne
    @s1gne Рік тому +4

    I love entropy, it's a great excuse for lots of things.

  • @orbitalengineer2800
    @orbitalengineer2800 11 місяців тому +4

    Does that mean that at any given moment, there is a non-zero probability that the universe will wind up back in its original state of singularity?

    • @user-yp2cs4js3n
      @user-yp2cs4js3n 11 місяців тому

      You are right; it's pretty much the same way one divides by 0 for a derivative. It's not the number 0, just a number very close to it, and that doesn't really affect anything for the problem being solved. For example, people often say instantaneous velocity, but it doesn't really make sense, actually. A car in a photograph doesn't move (just 1 frame, no time here), so for velocity we need this arbitrarily small nudge on the time axis. Thus, instantaneous here means a time window so small that it could have been an instant for our problem. And when we can't choose this arbitrarily small nudge anymore, we get quantum mechanics. In the same way, an improbable event here means that it just won't happen during the lifetime of the universe (but the probability isn't zero).

  • @joevignolor4u949
    @joevignolor4u949 11 місяців тому +1

    The way I've heard this explained is that there are a very limited number of ways something can be put together while there are a lot of different ways it can break apart. A drinking glass sitting on a table is a good example. While it's sitting on the table the glass is put together a specific way. But if you knock it off the table it will shatter on the floor and end up a different way. Because there are an almost infinite number of different ways it can end up after it falls the chances of it ever coming back together again exactly like it was before are also infinitely small.

  • @johnvosarogo1785
    @johnvosarogo1785 11 місяців тому +1

    Some thoughts I've had regarding this topic: I think the best or most useful conceptualization of time is as a measurement of relative change. Anytime time is referenced change must always also be referenced. So change is fundamental to time and given a static closed system there can be no internal clock. An implication of this seems to be that once the universe evolves to it's final state of absolute heat death. Where any remaining undecayed particles are so isolated that they can never interact with each other and so for all intents and purposes there is no longer any relative change in the universe and all that's left is besides individual isolated particles is the quantum fluctuations of empty space. When the universe arrives at that state, time has essentially stopped since there are no longer any events by which to measure time. From that point on, anything that does happen could be said to happen instantly afterward. So, imagining an external clock, if it took a near infinite amount of time to elapse on that external clock for the quantum foam to produce a fluctuation that evolved into the next iteration of a material universe. From the internal perspective, lacking the concept of time following the heat death of the previous iteration, the emergence of the next could be said to have happened instantly following the end of time of the previous. Therefore it becomes easy to imagine that the near infinite amount of "time" required for the quantum foam to produce a fluctuation that will evolve into a material universe is actually no time at all. The ultimate implication of that being that not only is the emergence of a universe out of the quantum fluctuations of empty space probabilistically likely, it's inevitable, since whatever time is required is available and actually is relatively instantaneous from the perspective of a changeless universe. All of this however assumes eternally expanding space filled with quantum foam. How that originated, why there's even that as opposed to true nothingness? Probably has to do with probability and the uncertainty principle in a way thats difficult for me as a layperson to conceptualize or explain but I feel like I understand intuitively. True nothingness is a form of absolute certainty and the uncertainty principle is such that a state of absolute true nothingness must instantly and randomly evolve into a state with non zero energy. Or something like that. @ArvinAsh I'd love to know if you, or anyone else reading this, thinks any of it makes any sense lol 🙏🏽🙌🏽🌌

  • @0-by-1_Publishing_LLC
    @0-by-1_Publishing_LLC Рік тому +7

    *TIME* is a measurement of change and nothing more. *Example:* Let's say the only thing in existence was an unlit lightbulb. Without any observable change happening, it logically remains in a *timeless state.* The lightbulb would appear exactly the same, so no measurement of any time passing can take place. However, once the lightbulb turns on, then that represents a "change" and also the beginning of time.
    ... But even that is not enough to represent a measurement called "TIME."
    You would need a minimum of one other instance of change to assess how much time has passed. So, if the lightbulb turns itself back off, then the amount of TIME between the two instances of change can be measured.

    • @CarlosElio82
      @CarlosElio82 Рік тому +1

      Hence, you need two different events, on and off. Who is the agent who flips the switch? In math, Peano uses 0 and 1 to create numbers; similarly, light needs electricity and magnetism to exist and propagate. In all cases, an agent performing the transformation is needed. In Peano it is the "successor" operation, and in light we have fields. Leibniz tried hard to find monads but found them not.

    • @ArvinAsh
      @ArvinAsh  Рік тому +4

      It is possible. The main argument against this would be that in General Relativity one can define a perfectly good spacetime without the need for any mass or object at all. So time would still exist in mass-less or matter-less universe.

    • @0-by-1_Publishing_LLC
      @0-by-1_Publishing_LLC Рік тому +1

      @@ArvinAsh *"The main argument against this would be that in General Relativity one can define a perfectly good spacetime without the need for any mass or object at all. So time would still exist in mass-less or matter-less universe."*
      ... I understand your reasoning, but I argue that time is not dependent on any type of structure and that any "theoretical" state where a change is taking place can be chronicled via time. If time is indeed a "measurement of change," then it can be applied to all aspects of existence (even nonphysical abstractions)
      *Example:* The numbers "0" and "1" have no material structure, but should a 0 change to a number 1, then this represents a change. Should this number 1 change back to 0, then the duration of this mathematical event can be measured via time.
      ... Excellent video, as always!

    • @timhaldane7588
      @timhaldane7588 Рік тому +1

      @@ArvinAsh I go back and forth on this. Logically, I just don't see how it's possible to define time independently of a change between two or more distinct states, just like I don't see how it's possible to define space except as a relationship between two or more distinct points. A dimension without a metric is undefined at best, meaningless at worst, right? But it's also hard to deny that GR and QFT, two of our most successful models of the universe, both seem to imply that space itself has a kind of independent existence of its own. 20th century science sure seemed to nod slyly in the direction of block spacetime.

    • @0-by-1_Publishing_LLC
      @0-by-1_Publishing_LLC Рік тому

      @@timhaldane7588 *"Logically, I just don't see how it's possible to define time independently of a change between two or more distinct states, just like I don't see how it's possible to define space except as a relationship between two or more distinct points."*
      ... I'm in the same camp. Modern science seems to want to attach as many properties as possible to easily explained phenomena. Push that mindset far enough and we end up with a new religion!

  • @mr.cosmos5199
    @mr.cosmos5199 Рік тому +3

    You just mean there are more ways to be disorganized than being organized,right?
    Therefore it’s more probable to be disorganized.
    Great video ❤

    • @thedeemon
      @thedeemon Рік тому +1

      The thing is: such macro states are only called "disorganized" because there are more microstates in them, i.e. we know less about which state the system is in exactly if we know it's in this macro state. There are more ways to be disorganized because states where there are more ways to be are called disorganized.

    • @hyperduality2838
      @hyperduality2838 Рік тому

      Randomness (uncertainty, entropy) is dual to order (certainty, syntropy).
      Certainty is dual to uncertainty -- the Heisenberg certainty/uncertainty principle.
      Signals (patterns, syntropy) are dual to noise (entropy) -- the signal to noise ratio in electronics.
      "Entropy is a measure of randomness" -- Roger Penrose.
      Syntropy is a measure of order.
      Repetition (patterns, order, syntropy) is dual to variation (disorder) -- music.
      Increasing the number of dimensions or states is an entropic process -- co-homology.
      Decreasing the number of dimensions or states is a syntropic process -- homology.
      Increasing is dual to decreasing.
      Homology (syntropy, convergence) is dual to co-homology (entropy, divergence).
      Syntropy (prediction) is dual to increasing entropy -- the 4th law of thermodynamics!
      Teleological physics (syntropy) is dual to non-teleological physics (entropy).
      Potential energy is dual to kinetic energy -- gravitational energy is dual.
      Synergy is dual to energy -- energy is dual.
      Energy is duality, duality is energy.
      The conservation of duality (energy) will be known as the 5th law of thermodynamics!
      Energy is dual to mass -- Einstein.
      Dark energy is dual to dark matter -- the universe is dual.
      "Always two there are" -- Yoda.
      Integration (syntropy, summation) is dual to differentiation (entropy, differences) - abstract algebra.
      The 4th law of thermodynamics is hardwired into mathematics & mathematical thinking.

    • @UnitSe7en
      @UnitSe7en Рік тому

      Makes me feel better about the state of my room.

    • @UnitSe7en
      @UnitSe7en Рік тому

      @@hyperduality2838 Is a list of descriptive definitions the only thing you have to say? It's totally irrelevant in every comment you've posted it in. (And there have been a lot.)

    • @hyperduality2838
      @hyperduality2838 Рік тому

      @@UnitSe7en Male is dual to female synthesizes children or offspring -- the Hegelian dialectic.
      The double helix should actually be called the dual helix as the code of life is dual -- two strands.
      Hydrophilic is dual to hydrophobic -- hydrogen bonding in DNA.
      A is dual to T.
      C is dual to G -- base pairs.
      The Penrose tribar:-
      ua-cam.com/video/l1PCE47jwng/v-deo.html
      The Penrose tribar is only consistent from two antipodal points or perspectives -- antipodal points identify for the rotation group SO(3).
      Gluons are force carriers as they attract and repel quarks:- attraction is dual to repulsion -- forces are dual.
      Action is dual to reaction -- Sir Isaac Newton or the duality of force.
      Push is dual to pull, stretch is dual to squeeze -- forces are dual.
      Protons (positive) are dual to electrons (negative) synthesize photons (neutral, neutrons) or pure energy.
      Duality (thesis, anti-thesis) synthesizes reality or non duality -- the time independent Hegelian dialectic.
      Stability is dual to instability -- optimized control theory.
      Space is dual to time -- Einstein, space is actually 4 dimensional.
      Waves are dual to particles -- quantum duality.
      The rule of two -- Darth Bane, Sith Lord.
      Subgroups are dual to subfields -- the Galois correspondence.
      Addition is dual to subtraction (additive inverses) -- abstract algebra.
      Multiplication is dual to division (multiplicative inverses) -- abstract algebra.
      The Penrose tribar is an impossible construction but consistent from only a dual (antipodal) perspective.
      Gravitation is equivalent or dual (isomorphic) to acceleration -- Einstein's happiest thought, the principle of equivalence, duality.
      Once you take duality seriously you can create new laws of physics, syntropy is dual to entropy -- the 4th law of thermodynamics!
      You are built from DNA (duality).
      Energy is duality, duality is energy -- waves, particles.

  • @nicholasmccue2726
    @nicholasmccue2726 11 місяців тому

    Thank you for your work Arvin

  • @jeronimoolivavelez1299
    @jeronimoolivavelez1299 Рік тому +1

    Great explanation as always.
    So in other words, when we are sleeping we are "at rest," a low-energy state. The "at rest" energy is transformed, spread, and transported to the bed, increasing the energy of the bed, so the bed is working well and is at a higher energy state.

  • @mysterio7807
    @mysterio7807 Рік тому +3

    You know, just commenting from an old video of yours regarding the Eye of Agamotto in the MCU. Since entropy only increases with time, it seemed to me, watching Doctor Strange last night, that the Eye of Agamotto reverses entropy. How does it do that? No idea. But the MCU used to be very clever with its science fiction until it became all over the place in Endgame.

    • @ArvinAsh
      @ArvinAsh  Рік тому +3

      Yes, that's why I don't really enjoy those Marvel movies so much. For a nerd like me, I just see all kinds of scientific flaws instead of enjoying the action! lol. Oh, the agony of being a nerd!

    • @hyperduality2838
      @hyperduality2838 Рік тому

      @@ArvinAsh Action is dual to reaction -- Sir Isaac Newton or the duality of force.
      Attraction is dual to repulsion, push is dual to pull, stretch is dual to squeeze -- forces are dual.
      Positive is dual to negative -- the electromagnetic force field is dual.

  • @KatjaTgirl
    @KatjaTgirl Рік тому +8

    Another great video Arvin, thanks!
    Since our universe is expanding and the speed of light is limited, in the future we will be able to interact with fewer and fewer particles. Does this mean that in the far future when our light cone gets emptier, entropy will actually go down while time is still moving forward? If so, what is the moment of maximum entropy of our universe?

    • @tektrixter
      @tektrixter Рік тому +6

      That concept is called "heat death". Given unlimited expansion of the universe, eventually all remaining particles will be evenly distributed and outside one another's light cones. At that point time itself may end as without interaction there can be no events to have intervals between. Entropy will be maximized.

    • @the6millionliraman
      @the6millionliraman Рік тому +3

      That's a really interesting question imo.
      As Roger Penrose has theorized, the universe's remote future of maximum entropy (heat death) is, at least mathematically speaking, fundamentally indistinguishable from the minimum entropy state of the universe at the Big Bang.
      Both instances are in thermal equilibrium.
      So it's almost like the maximum entropy state is (mathematically) exactly the same as the minimum entropy state.
      Penrose posits that in the remote future, after black holes have evaporated and there are just massless and timeless photons bombing around, the universe basically "forgets" how big it is and in some way returns to its Big Bang state. He uses MC Escher's fractals as an analogy. Fascinating. Who knows.

    • @hyperduality2838
      @hyperduality2838 Рік тому

      @@the6millionliraman Maximum is dual to minimum.

  • @jyotismoykalita
    @jyotismoykalita Рік тому +2

    This has to be the best explanation on entropy ever.

  • @daniloonuk
    @daniloonuk Рік тому +1

    Great one. I heard somewhere that entropy is hidden information; this explanation told me that increasing entropy means a lot of giving up.

  • @0-by-1_Publishing_LLC
    @0-by-1_Publishing_LLC Рік тому +2

    I equally argue that time cannot be reversed and can only move forward. In the "scrambled eggs" example, even if the entropy reversed itself, there would still be a *beginning* (the egg), a *middle* (the egg being scrambled) and an *end* (the egg returning to its original state of entropy).
    *The Past* is a database of all events that take place.
    *The Present* is now.
    *The Future* is a specific degree of probability based on past and present events.

    • @AdvaiticOneness1
      @AdvaiticOneness1 Рік тому +2

      It's always Now; Past and Future are just illusions.

    • @0-by-1_Publishing_LLC
      @0-by-1_Publishing_LLC Рік тому +1

      @@AdvaiticOneness1 *"It's always Now, Past and Future are just illusions."*
      ... That is a *semantic paradox* resulting from the words we are forced to use to communicate. However, your paradox has no bearing on reality. There is, was, and always will be a past, present, and a future.

    • @drbuckley1
      @drbuckley1 Рік тому

      Minkowski proposed the existence of hyperspace, a region of "now" that cannot be perceived by any observer. "Now" may be the illusion, since no two observers share the same "worldline."

    • @AdvaiticOneness1
      @AdvaiticOneness1 Рік тому +1

      ​@@0-by-1_Publishing_LLC Our perception of time as a continuous flow is created by our brain's processing of sensory information and memories. Time is not an objective feature of the universe, but rather a product of our consciousness.

    • @0-by-1_Publishing_LLC
      @0-by-1_Publishing_LLC Рік тому

      @@AdvaiticOneness1 *"Our perception of time as a continuous flow is created by our brain's processing of sensory information and memories."*
      ... Yes, and "brains" extrapolate *logic* from whatever we observe in a universe steeped in logic, and logic states that all events that have already taken place are recorded in the past; now is an immeasurable instant within the present, and the future is only a specific degree of probability based on data derived from past and present events.
      *"Time is not an objective feature of the universe, but rather a product of our consciousness."*
      ... Like it or not, our individual consciousnesses are an equal part of the universe ... just like everything else. We don't marginalize our ability to think, conceive, and process logic just because we are the ones wielding the power to do so.

  • @rafanifischer3152
    @rafanifischer3152 Рік тому +24

    I am an expert on probability. And I will now enlighten you: Never go on vacation to Las Vegas. You can thank me later.

    • @ArvinAsh
      @ArvinAsh  Рік тому +1

      Wise words! But the shows are nice.

    • @jasonspades1265
      @jasonspades1265 5 місяців тому +1

      Then you know playing blackjack isn't too much of a risk if you know what you're doing

    • @rafanifischer3152
      @rafanifischer3152 5 місяців тому

      @@jasonspades1265 Flipping a coin is not too much of a risk, but I wouldn't bet my house on it.

    • @baxakk7374
      @baxakk7374 3 місяці тому +1

      Skilled poker players against stupid people on vacation could win

    • @caveman3592
      @caveman3592 2 місяці тому

      😂

  • @abhishekc232
    @abhishekc232 Рік тому +1

    One of the best videos on entropy.

  • @ANGROCEL
    @ANGROCEL Рік тому +2

    I had to learn entropy and equilibrium last month. I like how you explained it.

    • @evgenistarikov3386
      @evgenistarikov3386 11 місяців тому +1

      But to really learn what entropy actually is, please see my commentary on this stream. Please bear in mind that any equilibrium is basically the result of the relevant action-counteraction interplay.

  • @alfadog67
    @alfadog67 Рік тому +6

    OUTSTANDING! Thanks, Professor Ash!
    If entropy is increasing, does that mean the speed of time is also decreasing? Could that look like universal expansion?

    • @ArvinAsh
      @ArvinAsh  Рік тому +9

      Well, I'm not sure what "speed of time" means. There is nothing like a standard time in the universe. Clocks tick faster or slower depending on your reference frame compared to other reference frames. This is what we learned from Special Relativity.
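
      For readers who want the quantitative version of this point, the standard special-relativity relation between the tick rates of clocks in two inertial frames is (a textbook formula, added here for context):

      ```latex
      % Time dilation between two inertial frames (Special Relativity)
      \Delta t' = \frac{\Delta t}{\sqrt{1 - v^{2}/c^{2}}}
      ```

      Here \Delta t is the interval between two ticks in the clock's own rest frame, v is the relative speed between the frames, and c is the speed of light; there is no universal "speed of time" that all observers share.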

    • @alfadog67
      @alfadog67 Рік тому

      @@ArvinAsh The way I'm visualizing it is that before, when there was less entropy, a second was a second, and today with our current entropy, a second is still a second... but when we compare the two, today's second is relatively longer than before.

    • @drbuckley1
      @drbuckley1 Рік тому +1

      @@alfadog67 Entropy increases in closed systems, but may be slowed (if not reversed) in open systems. The Earth is an open system, deriving most of its energy from the Sun. Entropy results in the Arrow of Time; the experience of time is observer dependent.

    • @iam6424
      @iam6424 Рік тому

      Probably, since it all came out as a probability phenomenon!
      But I wonder, what is "probability"?

    • @drbuckley1
      @drbuckley1 Рік тому

      @@iam6424 "Probability" is a human construct intended to approximate their expectations. It is not reality, which could not care less about what humans expect.

  • @anthonycarbone3826
    @anthonycarbone3826 Рік тому +5

    I think the real question is how everything got to such a high energy state in the first place. I could sum it up better as: why did the universe have so much order to begin with, when the original universe was full of disorder and only shortly after the beginning coalesced into an ordered, meaningful universe?

    • @anthonycarbone3826
      @anthonycarbone3826 Рік тому +3

      Arvin answered my question very well. But still it boggles the mind to think the early universe was more orderly when no atom could exist until the universe could cool down enough for photons to go their own way and allow for atoms to form.

    • @vitovittucci9801
      @vitovittucci9801 Рік тому

      Just before the Big Bang all matter and energy were compressed into an extremely small space. Because of the Uncertainty Principle, all particles (whatever they may be) were arranged in relatively precise positions (low entropy).

    • @anthonycarbone3826
      @anthonycarbone3826 Рік тому +1

      @@vitovittucci9801 There was no matter or energy before the Big Bang. To talk about anything before the Big Bang goes into the metaphysical realm and into the complete unknown. You mean afterwards, up to the time the universe cooled enough for matter to form.

    • @10002One
      @10002One 11 місяців тому

      @@hyperduality2838 Ugh! You again. You must be a Daoist with all of this yin-yang duality. I will risk posing a question -- if energy is duality, and synergy is dual to energy, where is symmetry? Don't just say that it is dual to dissymmetry. ☯

  • @ajayshinde1571
    @ajayshinde1571 Рік тому +1

    Really impressive video... the best explanation I have ever seen.

  • @danielduarte5073
    @danielduarte5073 Рік тому

    Great information. Well done!

  • @mmtrichey
    @mmtrichey Рік тому +16

    34 People liked it before it even played. Well done Arvin! :-)

    • @stefaniasmanio5857
      @stefaniasmanio5857 Рік тому +4

      I understand this is not scientific at all... But Arvin is Arvin.... ❤

    • @timhaldane7588
      @timhaldane7588 Рік тому +3

      Retrocausality.

    • @tayt_
      @tayt_ Рік тому

      Reported for hacking.

    • @tonalambiguity3345
      @tonalambiguity3345 Рік тому +1

      Probably had an early release for his patreon or something

    • @dsera2721
      @dsera2721 Рік тому

      You mean he's paying for views?

  • @kirksneckchop7873
    @kirksneckchop7873 Рік тому +7

    Your team did a great job! These concepts are quite complicated and nuanced. While some of the ideas are elementary (i.e., the concept of a state space), others you might only see as a graduate student (e.g., fluctuation theorems in statistical mechanics).

    • @NoActuallyGo-KCUF-Yourself
      @NoActuallyGo-KCUF-Yourself Рік тому

      ​@@hyperduality2838
      < reported for spam >

  • @AlexthunderGnum
    @AlexthunderGnum Рік тому +1

    Great video. Thank you! One remark from me: energy being more or less useful, or able to do work, is a very anthropocentric definition. It implies "useful to me" or "able to work for me".

    • @ArvinAsh
      @ArvinAsh  Рік тому +1

      It is ability to be transformed into other forms of energy, particularly kinetic energy. Gravitational potential energy can be transformed into other forms.

    • @AlexthunderGnum
      @AlexthunderGnum Рік тому

      @@ArvinAsh Energy in any form can be transformed into another form, given the right circumstances, no?

  • @Rationalific
    @Rationalific Рік тому +1

    Thanks for another informative video!

  • @davidgracely7122
    @davidgracely7122 7 місяців тому +3

    Very well presented. It was very important that this video pointed out that the words "entropy" and "disorder" are interchangeable in meaning.
    The discovery that the disorganization of a system is directly related to the probability of such a system coming into existence by random processes was a great breakthrough. And it gives the formula for how to calculate this probability. It not only shows what chemical and physical reactions will likely take place spontaneously under normal conditions, but also shows us that the overall entropy or disorder in the universe is increasing with time.
    However, Boltzmann's formula is a description of what is observed. The idea that probability is the causative factor in the second law of thermodynamics is a philosophical position. An equation describes what is observed and what to expect under certain conditions. It does not tell us why such a law in our natural world exists, nor why it follows the mathematical pattern that it does, nor by what means such a law came into existence.
    "The heavens shall wax old as doth a garment". This was declared in Scripture long ago. This increase in disorder, as a layman like myself understands it, is an aging process, and our common everyday observations of the world around us shows that it is true for both living and non-living systems.
    If you were to start a plumb bob swinging back and forth, you would notice that each swing was becoming less and less. Even if you were to put the pendulum underneath an evacuated bell jar, the dying away of the motion, while taking place at a slower rate, would still be observable. In like manner, if you took an original photo and made multiple copies of it, each copy being made from the preceding copy, it would be noticed that the copies would look worse and worse the further away you got from the original.
    The second law of thermodynamics shows us that nothing in our universe works with 100% efficiency. Energy is expelled from a process in the form of heat, which my physics textbook calls "disordered energy". While the total amount of energy is still the same, it has been spread out into other less usable forms just as explained in this video. That is why it is impossible to create a perpetual motion machine.
    What are all living things physically speaking? They are genetic copies of copies of copies going back further and further in time. The second law of thermodynamics makes a prediction---namely that the genetic copying process is not going to take place with 100% efficiency and that this inefficiency of genetic transfer from generation to generation has to show up somehow. And it does. As detrimental mutations. As to any supposedly beneficial change, it is necessary to be able to rule out the possibility that it is simply a reversion back to a previously undamaged condition, especially since there is a backup to the genetic code incorporated into the cell, with a computer-like checksum process kicking in to minimize the harmful effects of damaged genes being transmitted to the next generation of living things. This shows that the gene pools of all living things in our universe are aging.
    The theory of evolution should have been discarded when the mathematical understanding of the second law of thermodynamics was discovered. The fact that it hasn't been is a testament to the fact that the scientific world is not as objective as advertised.
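
    For reference, the formula alluded to above is Boltzmann's entropy relation, which counts the number of microstates W compatible with a given macrostate (a standard textbook statement, added here for context):

    ```latex
    % Boltzmann's entropy formula
    S = k_{B} \ln W
    ```

    Here k_B is Boltzmann's constant. With all microstates treated as equally probable, a larger W means a more probable macrostate and a higher entropy; as the comment notes, the relation describes this counting but does not by itself say why the law holds or why the universe began in a low-W state.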

  • @faikerdogan2802
    @faikerdogan2802 Рік тому +1

    God-tier video. So many videos I've watched about entropy, and this one is just the best.

  • @duncanfankboner1319
    @duncanfankboner1319 Рік тому +2

    Things don't change, they just move from the present into the past; what we call time is that movement.

  • @shethtejas104
    @shethtejas104 Рік тому +1

    Many thanks Arvin 👍👍 Kudos for thinking up the pencil and table example. Its simplicity allows viewers to get the not-so-simple concept of entropy. If the expansion of the universe can be explained by its natural tendency to find more ways to distribute energy, why are we looking for the mysterious dark energy? And also, why does this tendency not act at smaller scales? For example, our planet could expand while conserving energy and therefore exist in a higher entropy state. Finally, on a related note, it would be interesting to think about gravity as an enemy of entropy. At interstellar scales, it doesn't allow the expansion to occur. But at intergalactic scales, somehow entropy is taking over (the tendency to expand and find more ways to distribute energy).
    You did a damn good job as always. You got me thinking, wondering and asking questions!

  • @binbots
    @binbots Рік тому +2

    The arrow of time points forward in time because of the wave function collapse. Because causality has a speed limit (c) every point in space that you observe it from will appear to be the closest to the present moment. When we look out into the universe we see the past which is made of particles (GR). When we try to look at smaller and smaller sizes and distances, we are actually looking closer and closer to the present moment (QM). The wave property of particles appears when we start looking into the future of that particle. It is a probability wave because the future is probabilistic. Wave function collapse happens when we bring a particle into the present/past. GR is making measurements in the predictable past. QM is trying to make measurements of the probabilistic future.

    • @timhaldane7588
      @timhaldane7588 Рік тому +1

      I tend to lean toward a similar explanation, myself. Wave function collapse / decoherence seems to amount to some kind of dispersion of information into the environment whenever any previously independent particles interact. I picture this collapse as an exchange of information, sent and received in every temporal direction along the particle's worldline. Think of it like nature "error checking" to make sure it's behaving consistently and coherently. Since we are macroscopic objects, caught up in the flow of time toward the future, the past looks fixed to us while the future appears indeterminate, but in a sense, the future and past are both fixed AND contingent. Contingent in the sense that everything depends on everything else. Fixed in the superdeterministic sense that a spacetime-wide web of contingencies wouldn't leave much room to escape determinism.

    • @wheelswheels9199
      @wheelswheels9199 Рік тому

      There is no evidence that the wave function is even a real thing, let alone that a collapse actually happens. It’s a model that gives us some results that agree with observations.

  • @GururajBN
    @GururajBN Рік тому +2

    Entropy plays a role on my work table too. If I do not periodically rearrange the things, they become so mixed up and chaotic!

  • @webx135
    @webx135 Рік тому +2

    One way this could be worded is that all possible configurations are equally likely; there are just far fewer configurations that are the ones you would be looking for.
    So say you roll a 20-sided die and are hoping for a 20. The piece educators tend to miss is that when they explain it, it often sounds like a weighted die. But rather, it's the fact that you are hoping for a 20, and there are 19 other equally likely states that AREN'T 20. These other states aren't more likely than the 20.
    So with the scrambled egg analogy, it's not like the "unscrambled" state is less likely than, say, a very, very SPECIFIC way for the egg to be scrambled. It's just that we define "unscrambled" in such a specific way that it is absurdly unlikely to ever be in that state. "What if I rolled a 20?"
    You could also ask "What are the chances the egg would be scrambled in a way that the yolk spells out the complete works of Shakespeare?". There are an absurd number of combinations in total. So the likelihood of that specific state is insanely low. This would be like asking "What if I rolled a 19?"
    Or you could ask "What are the chances that the egg would be about halfway scrambled", and now you've included a TON of states that would fit this description, so it's pretty likely to hit one of them, even if each state individually isn't more likely than the others. This would be like asking "What if I rolled something higher than 10?"
    The likelihood of the states doesn't change. The number of combinations you choose is what changes.
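
    A quick numerical illustration of this grouping point, assuming nothing more than a fair 20-sided die simulated in software:

    ```python
    import random

    random.seed(0)
    rolls = [random.randint(1, 20) for _ in range(100_000)]

    # Every individual face is (roughly) equally likely...
    p_20 = sum(r == 20 for r in rolls) / len(rolls)
    p_19 = sum(r == 19 for r in rolls) / len(rolls)
    # ...but a "macrostate" that groups many faces together is far more likely.
    p_over_10 = sum(r > 10 for r in rolls) / len(rolls)

    print(f"P(roll == 20) ~ {p_20:.3f}  (expected 0.05)")
    print(f"P(roll == 19) ~ {p_19:.3f}  (expected 0.05)")
    print(f"P(roll > 10)  ~ {p_over_10:.3f}  (expected 0.50)")
    ```

    The unscrambled egg plays the role of a single face; "scrambled somehow or other" plays the role of "anything above 10".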

  • @williammartine5168
    @williammartine5168 Рік тому +2

    Well done and well handled. Entropy changes can be quite deceptive, like living things developing---growing into an organized entity from the disorganized surroundings. The time/entropy connection has always interested me; I wonder if time and entropy are equivalent when viewed from a higher dimension. Then there is the concept that time is only an illusion---well, it would seem that entropy can govern causality with as much direction as the apparent 'arrow of time'. Thanks for all your awesome work on the videos.

  • @andyonions7864
    @andyonions7864 Рік тому +2

    Loved the 2D animations of hydrogen and oxygen combining, showing clearly the incomplete hydrogen shell having only 1 of 2 electrons and the oxygen having two shells, the innermost complete with 2 electrons and the outer shell missing 2 electrons (having 6 out of 8). Then after the combination, both hydrogens share 2 electrons and the oxygen shares 8 in their outer shells. Just as I learnt at school all those years ago...

  • @emergentform1188
    @emergentform1188 Рік тому +1

    Wow amazing, another home run Arvin woohoo!

  • @MrKelaher
    @MrKelaher Рік тому +1

    My take: entropy is all about "summary states" - states you cannot, or choose not to, distinguish at some degree of "resolution". ALL observations happen at a distance and/or with fundamental limits on precision, so there are fundamentally more states that look the same, i.e. their fields have the same influence on an observer frame at a distance, than ones that look unique as a remote frame evolves kinematically.

  • @Ladoyar77
    @Ladoyar77 Рік тому +1

    Great explanation!

  • @mikegLXIVMM
    @mikegLXIVMM 11 місяців тому

    Well explained thanks!

  • @f.jansen2892
    @f.jansen2892 11 місяців тому +1

    Really great video!

  • @Hossak
    @Hossak Рік тому +1

    I felt like standing up and applauding after this video. Thank you so much, my friend :)

  • @kushrungta4025
    @kushrungta4025 7 місяців тому +1

    Something that I believe is that time still can't be defined by entropy, because, as you said, it comes down to change. Change in itself can't exist without time, and entropy increases with time, not the other way around!

  • @chrisjager5370
    @chrisjager5370 13 днів тому

    Another fun thing about entropy is that information has the exact same equation as entropy, so you could say that the 2nd Law of Thermodynamics is that the universe's information content always increases -- but this version of information (Shannon entropy) doesn't mean the information is useful, only that low-entropy stuff can be described with fewer bits.
    So, for example, 1000 coins all set to heads are easy to describe; if you shake them up, eventually you need to read out the whole sequence to describe them; and if someone flipped them around to write a binary message in, then the longer the message, the more random they would look.
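
    A small sketch of that description-length idea, using the empirical Shannon entropy of a heads/tails sequence as the per-coin bit cost (illustrative only; the helper function is made up for this example):

    ```python
    import math
    import random

    def bits_per_coin(seq):
        """Empirical Shannon entropy, in bits per symbol, of a heads/tails string."""
        n = len(seq)
        total = 0.0
        for symbol in set(seq):
            p = seq.count(symbol) / n
            total -= p * math.log2(p)
        return total

    random.seed(0)
    all_heads = "H" * 1000                                      # low entropy: trivially described
    shaken = "".join(random.choice("HT") for _ in range(1000))  # high entropy: must list the sequence

    print(bits_per_coin(all_heads))  # 0.0 bits per coin
    print(bits_per_coin(shaken))     # close to 1.0 bit per coin
    ```

    The all-heads state compresses to essentially nothing, while the shaken-up state needs roughly one bit per coin; that is the sense in which higher entropy means more bits are needed to pin down the exact microstate.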

  • @toastedsniper9248
    @toastedsniper9248 Рік тому +1

    Thank you! That's so cool.

  • @steveDC51
    @steveDC51 Рік тому

    Another good one Arvin.

  • @lukedowneslukedownes5900
    @lukedowneslukedownes5900 Рік тому +1

    The very short summary, or the answer to this, is that everything is connected to everything else through infinite time and space, which means every piece of matter has some sort of relative effect that can alter another piece of matter, and this is exponential except at the quantum scale, from what we know in science so far.

  • @spinninglink
    @spinninglink Рік тому

    Whoa, never heard it described this way, very interesting to think about and consider.

  • @kennethbaird968
    @kennethbaird968 11 місяців тому +1

    Thank you, time has always puzzled me. How everything seems to exist and react in a predictable way at a certain time. And we can observe everything thanks to time (but it is not time, it is entropy, that we are actually observing).

  • @LuigiSimoncini
    @LuigiSimoncini Рік тому +1

    The statistics explanation (# of configurations before and after the energy change) is spot on; the concept of "useful" used in the previous explanation is... not useful, given that the definition of usefulness depends on the observer.

  • @LQhristian
    @LQhristian Рік тому +2

    It would seem more intuitive to correlate entropy with 'decay.' I.e.: The higher the energy level of the object/system, the lower its entropy/decay rate. This would explain why time would appear slower, the higher the dimension (with higher mass particles, stronger gravity). Just a thought :-).

  • @OmEnOriginals
    @OmEnOriginals 8 місяців тому +2

    You're a great man... I'm one of your very earliest followers.

    • @ArvinAsh
      @ArvinAsh  8 місяців тому +1

      Awesome! Thank you!

    • @OmEnOriginals
      @OmEnOriginals 8 місяців тому

      @@ArvinAsh You're warmly welcome Sir 🙏

  • @paulhofmann3798
    @paulhofmann3798 8 місяців тому +1

    Nice explanation of entropy, and maybe intuitive for some people, connecting entropy to the probability of states as Boltzmann first did. In this paradigm, though, we cannot explain the Saturn rings, for example. But one can do better. One can go beyond probability of states to measure theory to quantify chaos. Kolmogorov used measure theory to define Kolmogorov complexity, which has Boltzmann entropy as a special case in its belly. Kolmogorov complexity and the KAM theorem are able to explain why there are unstable chaotic orbits of the Saturn rings between the stable orbits. Further, classically there is no explanation for the second law of thermodynamics. It's just an unexplainable law. Roger Penrose claims the fact that entropy grows in the universe stems from a symmetry breaking at the Big Bang. Other universes may thus not have a second law of thermodynamics. By the way, there are ways in which two liquids unmix, e.g. the Belousov-Zhabotinsky reaction. I loved the look on the faces of the students when I showed it during my lectures. There is more to physics than equilibrium Boltzmann-type thermodynamics. Think of your pencil, or the physical pendulum. They show bifurcation and chaotic behavior.
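
    Kolmogorov complexity itself is uncomputable, but a crude way to get a feel for the structured-versus-chaotic distinction mentioned here is to compare how well a general-purpose compressor shrinks an ordered byte string versus a random one (a rough proxy only, not a measurement of Kolmogorov complexity):

    ```python
    import os
    import zlib

    structured = b"AB" * 5000          # highly ordered: a short rule generates it
    random_bytes = os.urandom(10_000)  # effectively patternless

    print(len(zlib.compress(structured)))    # a few dozen bytes
    print(len(zlib.compress(random_bytes)))  # roughly 10,000 bytes (no real compression)
    ```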

  • @nickharrison3748
    @nickharrison3748 Рік тому +1

    very nicely explained

  • @Yasmin-pi5pr
    @Yasmin-pi5pr 8 місяців тому +2

    So, if this "force" (or 2nd law of thermodynamics) is the tendency to increase entropy, transforming energy into one that has less potential to do work, what is the opposite?
    What's the force that concentrates energy and increases its potential to do work?
    I once read that life is the opposite of entropy. The symmetry of there being 2 opposing forces makes some scientific sense.
    Thank you very much, your explanations are simply perfect.