Entropy is not what you think!

  • Published Oct 4, 2024

COMMENTS • 118

  • @tenns 10 months ago +11

    that marble fortune teller bit was amazing, and beautifully made

  • @qookiemonsta2557 10 months ago +20

    your visualisations and examples are carefully thought out, which makes it a profound joy to watch your videos!

  • @kat13man 8 months ago +3

    Thank you for this brilliant visualization of entropy. You have truly done a great service in taking the time to show us how Boltzmann's definition of entropy works. And while entropy might be an artifact of how scientists looked at things, it does not diminish the incredible insight and expansion of understanding of our universe that an equation for entropy does for science through the use of mathematics. Manipulation of the entropy equation reveals unseen secrets. Then, there is Boltzmann's constant, the mathematical smoking gun that the Universe was designed. Can you explain why this constant keeps appearing over and over again? It is more than a mathematical anomaly, because we can measure it using a gas at constant temperature, where pressure times volume is always the same. Entropy appears to be a way of looking at things that provides deep insight into the workings of the Universe. Also, in information theory, those marble configurations with low entropy are very valuable. Entropy goes everywhere.

    • @MarbleScience 8 months ago +3

      Thanks! Actually, there is nothing magical about Boltzmann's constant. It is just a matter of units. The sort-of "natural" unit of entropy is given by the chosen logarithm. If you choose the logarithm with base two, then the resulting unit of entropy is the "bit", which is popular in computer science. If you choose the logarithm with base e, then the unit is the "nat". Now the problem is that, historically, we (kind of arbitrarily) chose units for energy, temperature and entropy before we understood the statistical nature behind them. Now we have to multiply by Boltzmann's constant every time entropy pops up, to convert it into these historically grown units.
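
      A minimal sketch of this unit bookkeeping (W here is an arbitrary, made-up microstate count; only the logarithm base and the factor of Boltzmann's constant differ between units):

          import math

          k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact SI value)
          W = 2**20           # hypothetical number of equally likely microstates

          S_bits = math.log2(W)     # entropy in bits (base-2 logarithm)
          S_nats = math.log(W)      # entropy in nats (natural logarithm)
          S_JK = k_B * math.log(W)  # the same entropy in historical units, J/K

          print(f"{S_bits:.0f} bit = {S_nats:.2f} nat = {S_JK:.3e} J/K")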

  • @sebastiandierks7919 10 months ago +14

    Interesting to compare physicists with fortune tellers, of all things xD Certainly provocative, but of course the metaphor is well phrased and explained. Nice playful video on the fundamental idea of statistical mechanics.

  • @u.s.navy_pete4111 10 months ago +9

    You are an admirable teacher and presenter. Keep it up!

  • @datre8256 10 months ago +6

    Really nice video! I hadn't thought about entropy in this way.

  • @amarnamarpan 7 months ago +1

    The definition of entropy depends on the intelligence of the observer. It's not just about probability; it is about computability. We speak of probability only when the numerous microstates are either hard to measure or, even if we do measure them, it is still hard to use those measurements to predict the future.

  • @adelelopez1246 10 months ago +2

    Excellent explanation! It's a difficult concept to understand correctly, and so it's really great seeing an accessible explanation of it that actually gets it right!

  • @RAyLV17 10 months ago +4

    Didn't realize that I was already subbed to this channel. I just got a notification for this vid, and wow what a wonderful video this was! If I could, I'd subscribe again! :p

    • @MarbleScience 10 months ago +3

      Uploading a video is always a bit of an anxious moment. Your comment is pure relief :) Thanks for the lovely comment!

  • @fanir33 6 months ago

    This channel is pure gold

  • @HyattPan 2 months ago

    If one day we have WW3, or I'm on a battlefield, I wish you were on my team. You seem to know your stuff and are sharp inside. And this video is the best introduction to entropy. The animation and the marble model box are superb. You could just sell the marble box set; I believe everyone who watches this clip would want to have one. It can remind us how this world became what it is now. Thanks again.

    • @MarbleScience 2 months ago

      Let's just hope it doesn't come to that :D You can 3d-print the box if you like: www.thingiverse.com/thing:6361546

  • @HuLi-iota 6 months ago

    I wish you had a video explaining how you did this visualisation and how time-consuming it was. Respect!

  • @srijantiwari8152 10 months ago +1

    Exceptional video!!

  • @jph8220 9 months ago +1

    I happened upon your channel a couple of years ago when looking for teaching inspiration, and vaguely remember a video where you were considering funding or some kind of full-time job on the platform; I hope you found success! :) You clearly put a lot of work/effort into this and you're doing a phenomenal job. The fortune teller comparison is quite an amusing one that I think works well.
    If you're looking to appeal to a general audience for science outreach, the only thing I would say is maybe don't include the equation for entropy in the video and just emphasize that it's directly related to the number of microstates (kB and ln don't change anything on a conceptual level and IMO would take away from the video for the less mathematically-oriented who don't know what either of these things are).

    • @MarbleScience 9 months ago +1

      Thanks :)
      I agree that kB and the logarithm are not really important here, but some might still prefer to see the concrete equation. The good thing about YouTube is that you get data to analyze these kinds of things. As a creator, e.g., you get a retention graph showing how much each part of the video is watched. If a lot of people were strongly confused by the formula and stopped watching, I would see a drop-off in retention after showing the equation. That is not the case. Actually, there is a small peak in retention from people going back to the formula to look at it a second time.

    • @jph8220 9 months ago

      @@MarbleScience Cool, I don't create videos so I had no idea it had those sorts of analytics!

  • @blackskyy3274 10 months ago +2

    This is an amazing video! Thank you :)

  • @Eastern1 9 months ago

    Thank god you are back

  • @galaxyvita2045 9 months ago

    I think one of the most interesting things about entropy is that we use it to define the arrow of time. All other physical laws are symmetric when it comes to going forward and backward in time. Entropy can be seen as a way to predict to which part of phase space a system will go over time.
    Another interesting thing is that there are multiple definitions of entropy depending on the field in which you are working; compare science, math, biology, ... They all have different definitions, which are all related, but we don't know the big underlying idea yet.

  • @kgblankinship 10 months ago

    This is a very creative, insightful, and well presented description of physics.

  • @crawkn 10 months ago

    The study of entropy emerged from the study of the behavior of gases in enclosed spaces. It was useful in that context, but I had come to believe that the sorts of things people claimed entropy meant for the future of the universe exceeded the power of any principle of entropy to predict. However, the perspective that it is in fact not predicting anything very specific perhaps changes it from wrong to irrelevant.

  • @joshuat6124 9 months ago

    Nice video! As someone with a PhD in atomistic simulation, it's nice to revisit these concepts :) I highly recommend the series 'Order and Disorder' by Jim Al-Khalili for people interested in the story of how energy and entropy emerged as concepts.

  • @50secs 9 months ago

    Beautiful video, wish you all the best.
    Your explanation really added another dimension to my understanding of generalisation through entropy.

  • @tecno_andre2752 10 months ago +3

    a very epic video

  • @gabrielstahl5629 1 month ago

    Great video, thanks a lot!

  • @FleshgodImmolation 4 months ago

    Amazing video! I would love to see a video on free energies

  • @vishalbhoknal2729 7 months ago

    Please share some sources to dive deeper into this interpretation of entropy you explained.

  • @anirbanmandal3123 5 months ago

    wonderful video

  • @lomash_irl 2 months ago

    This video is like therapy for students who were not taught appropriately 😅

  • @danielwalker5682 8 months ago

    Wonderful explanations.

  • @ritviksharmaphysics 6 months ago

    Thank you.

  • @Brombelade 9 months ago

    When I saw your video, I was reminded that economists differentiate between microeconomics and macroeconomics. I wonder if we can draw an analogy to the microstates and macrostates you are talking about here. A fundamental issue in economics is whether macroeconomic variables (such as inflation/deflation, growth/recession, (un)employment) can be explained as emergent from microeconomic behaviors, i.e., the economic decisions of millions of individuals. The current state of the art in economics says they cannot. But I have always wondered if that is really true, or in other words, if our microeconomic models might be too simple to give rise to the macroeconomic effects we can observe.
    I wonder if in thermodynamics we can observe a similar disconnect between microphysics (e.g., behavior of atoms) and macrophysics (such as entropy).

    • @MarbleScience 9 months ago +1

      Interesting question! In thermodynamics / computational chemistry, at least, I think it is mainly a question of computational resources. With current resources you can simulate a single protein at an atomistic level for a reasonable time. If you want to study how a complete cell works, or how multiple cells interact with each other, you won't get far with an atomistic model. This, however, does not mean that it wouldn't work in principle if you had the resources.

  • @CristalMediumBlue 10 months ago

    Really well explained! Thanks

  • @ahsaaah7247 6 months ago

    your way of explaining is such an addiction. Can you talk about enthalpy too?

  • @Roman_4x5 9 months ago

    It would be nice to start from the definition of entropy in the first place ;)

  • @harikumarmuthu8819 7 months ago

    Hey, can you explain phonons, and the difference between temperature and sound in terms of phonons, as both of them are different manifestations of phonons?

  • @nickb863 4 months ago

    Could you please explain non-ergodicity (systems that don't visit all possible microstates), as well as how chaos can turn to order, or vice versa? Tsallis entropy?

  • @richardogujawa-oldaccount1336 8 months ago

    Love this guy

  • @deepakkumar2078 7 months ago

    What kind of knowledge is required to make these kinds of simulations? Please tell us the tools and techniques used to make this type of illustration.

  • @danielrhouck 9 months ago +1

    You show two specific microstates and say they have the same entropy, which in a sense is true, but there are natural ways of analyzing the information. I don’t just mean natural in human psychology; I mean mathematically natural definitions.
    Computer science has an entire sub-field of information theory which does let you talk about the entropy of individual bit sequences. There isn't quite a full definition, but there is one up to a constant, and it takes more information to specify a random arrangement than it does to specify "the left half".
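
    A quick way to see this point with off-the-shelf compression (a sketch only; zlib's compressed size is a crude, computable stand-in for the true description length, which is uncomputable):

        import os, zlib

        n = 1024                 # sequence length in bytes
        ordered = b"\x00" * n    # "all marbles on the left": highly regular
        typical = os.urandom(n)  # a typical random arrangement

        # Compressed size is an upper bound on the description length.
        print(len(zlib.compress(ordered)))  # ~ a dozen bytes
        print(len(zlib.compress(typical)))  # ~ n bytes: no shorter description found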

    • @MarbleScience 9 months ago

      I had quite a long and interesting discussion about that with @dv2915 here in the comments: ua-cam.com/video/QjD3nvJLmbA/v-deo.html&lc=Ugw6x5x_bkd0TVExFyR4AaABAg
      Maybe you can add something to that :)

    • @danielrhouck 9 months ago

      @@MarbleScience The link isn't working for me; it just takes me to the video, not to the specific comment. I blame YouTube's site.

    • @danielrhouck 9 months ago

      I found it. You are great at explaining things and should have a wider audience but for now having a small audience makes it easy to find a specific thread.

  • @Lastonestanding7 8 months ago

    I was always told that entropy is the arrow of time, the only thing that differentiates past from future. Since the concept of things in a system evolving into a more likely state has itself no physical meaning, does time lose its meaning too?

  • @BabaBoee5198 10 months ago +2

    Yay

  • @giefuser 9 months ago

    What is the connection between entropy and potential energy?

    • @MarbleScience 9 months ago

      I am thinking about making a video about that. Stay tuned :)

  • @adel_mdfkrjones 7 months ago

    Bruh, can you please tell us how you do your animations? I'm currently doing a project on diffuse transmittance testing in optical phantoms and I'm struggling to find out how to simulate it.

  • @LuisAldamiz 9 months ago +2

    So entropy is just about probability? It's weird that modern physics is so obsessed with chance, really.

    • @MarbleScience 9 months ago +3

      It is, yes! Everything that involves a lot of atoms (that means basically everything that matters to us) is a question of statistics. For my taste we don't even acknowledge the importance of statistics nearly enough.

    • @LuisAldamiz 9 months ago

      @@MarbleScience - Maybe, but I wouldn't want to be interpreted as being against statistics as a technique; making it (or probability) not just a method but the very pillar of modern science sounds to me like those neoplatonists who believe that math or information is real beyond actual reality (physics, nature). Almost certainly a related issue.
      As they say: "there are lies, damn lies, and then there are statistics". And then they also say that "the easiest person to deceive is yourself".
      My issue would be anyhow that maybe physicists are taking probability too seriously in many ways, including quantum mechanics. After all, it seems to imply that blond people are more entropic than brunette people, who are much more common globally, and I see no reason for that: rareness or peculiarity would be a better name than entropy, IMO.

    • @Vlow52 6 months ago +1

      If you want the world to be precise and accurate, it will only disappoint you. Physicists tend to make a theory that correlates with the outcome and believe in it like it's reality, but it's always a narrow, relative view based on measures and conceptions that were also just believed to work. No matter how advanced and broad a theory could be, it's just a theory and can't explain the whole system, because it is a part of it.

    • @LuisAldamiz 6 months ago

      @@Vlow52 - It's fine: experimental outcomes are evidence. What has been discovered is OK; the problem is in how they want to make sense of it by slashing out General Relativity and not being humble about what we truly know re. Quantum Mechanics.
      The problem is not of faith in experimental outcomes but of way too much faith in maths, and also a lack of interest in further advances. Plus probably some questionable Newtonianist leftovers, not so much in Relativity (often accused of being "classical" because of its lack of quantum granularity) but in Quantum Mechanics (which stubbornly retains Newtonian time and space, since Dirac failed at achieving Unification and Schrödinger took over from there).
      Schrödinger's equation is neither dead nor alive, but nobody seems to dare to open the box.

    • @taktsing4969 4 months ago +1

      You know, God does play dice, and he's a really obsessed one.

  • @ab-tu5wc 10 months ago +1

    Cool video. I was wondering what you use to animate this stuff. Are there resources for making things like mathematical simulations in software like Blender/Maya, or is there other software better suited for this kind of thing?

    • @MarbleScience 10 months ago +2

      Thanks! I use Blender for my animations. This time I made quite heavy use of Blender's new geometry nodes and simulation nodes, e.g. to visualize the marble arrangements. There are not many tutorials specifically for mathematical stuff, but once you understand how it works, it is no problem to use it for anything you like. There are plenty of great general tutorials here on YouTube.

    • @ab-tu5wc 10 months ago +1

      ​@@MarbleScience Thank you for the information! The visualizations are great as usual, keep up the great work.

  • @jareknowak8712 10 months ago +1

    👍

  • @nicholasdepaola3740 7 months ago

    Bro you good😮

  • @eyesburning 10 months ago

    Amazing video as always! Where can I get the marble box you had in this video? Did you 3D print it?

    • @MarbleScience 10 months ago +1

      Thanks :) Yes, it is 3D printed, and I have uploaded the files to Thingiverse for you and other people who might find it useful:
      www.thingiverse.com/thing:6361546
      The problem is that with a new account it apparently takes 24h until my upload becomes public. You will need some patience ;) sorry

    • @eyesburning 10 months ago

      @@MarbleScience Amazing, thanks so much! I will check back in 24 hours. And where did you buy the orange marbles if I may ask?

  • @rockapedra1130 9 months ago

    Same

  • @kevon217 8 months ago

    that damn bird!

  • @dv2915 10 months ago

    Not quite correct, in my opinion. Take the following definition: entropy is the length of the message needed to precisely describe the state of the system.
    Now, one state is 'all nine balls are on the left side'.
    Another state is 'one ball on the left side in row 1, column 3, two balls on the left in row 2 columns 2 and 3, two balls on the left in row 3 columns 1 and 3, two balls on the right in row 1 columns 1 and 2, one ball on the right in row 2 column 3, one ball on the right in row 3 column 2'.
    Do these two states look like they have the same entropy?

    • @MarbleScience 10 months ago +1

      Interesting Question! Thank you!
      I think the problem with your definition is that it mainly measures how efficient a language is at describing something. Each language would need a different number of characters to describe a state, and we would end up with different entropy values for every language.
      Maybe another language has a simple word for the locations occupied by marbles in the second state, like our language has a word for "left".
      Also, now that you have watched the video, I can simply describe the second state as "the second state". That's a short statement. Is the entropy lower now?

    • @dv2915 10 months ago

      @@MarbleScience I'd say not. The full message now has to include both the previous description of 'the second state' and your last statement.
      Programmatically, you can first declare two matrices of a certain size and then simply say that matrix one is filled with ones and matrix two is filled with zeros. Any other state will require providing 'snapshots' of both matrices. And that will lengthen the message, no matter what language you use.

    • @MarbleScience 10 months ago

      @@dv2915 But then wouldn't the "full message" also need to include a definition of the language?
      We can't take it for granted that the recipient knows our language if we can't take it for granted that the recipient knows my video 😄

    • @dv2915 10 months ago

      @@MarbleScience True, the full message would have to include, if not the definition of the language, then at least a dictionary of the terms. But still, the principle holds. Like, your example of 'the second state' would require a 'dictionary' of all possible states for this set of matrices and marbles.

    • @MarbleScience 10 months ago

      @@dv2915 Typical programming languages might contain a built-in function to directly generate a matrix of all ones or zeros (because their creators thought that might be useful). They don't have to contain a built-in function to generate every possible state directly. Now I could ship a programming language that has a built-in function to directly generate "the second state". Why would it be OK to have a built-in function that generates a matrix of all zeros without having functions for all other states, but not OK to have a built-in function that creates "the second state" without having a complete "dictionary" of functions for each possible state?

  • @andracoisbored 6 months ago

    entropy makes my brain hurt.

  • @HansLemurson 9 months ago

    Soo...entropy is in the eye of the beholder?

    • @MarbleScience 9 months ago

      In a sense, yes. However, if two people choose the same macroscopic variables to describe something, they will end up with the same entropy values. It does not so much depend on the observer as on the perspective they take. E.g. if we choose to count the number of marbles on the left side of the grid, the entropy is well defined for that way of looking at the system, and anyone who takes this same perspective will get the same values.
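
      As a minimal sketch of that perspective (assuming the nine-marble box with nine sites per side described in the comments below), the entropy of each macrostate "k marbles on the left" follows from counting its microstates:

          from math import comb, log

          LEFT_SITES = RIGHT_SITES = 9  # assumed 3x3 grid on each side
          MARBLES = 9

          # Microstates of macrostate "k marbles on the left": which k of the
          # 9 left sites and which 9 - k of the 9 right sites are occupied.
          for k in range(MARBLES + 1):
              W = comb(LEFT_SITES, k) * comb(RIGHT_SITES, MARBLES - k)
              S = log(W)  # entropy in nats; multiply by k_B to get J/K
              print(f"k={k}: W={W:5d}  S={S:5.2f} nat")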

  • @Zeropadd 9 months ago

    🤯

  • @Dennis-hb8tw 10 months ago

    3D printed marbles! yippie!

  • @jnrose2 4 months ago

    TIME OUT!!! This rendition is a limited and not thorough depiction of Entropy as identified 300 years ago. The interpretation of comparing statistical microstates (static topological geometric options) obliterates and ignores the original property … the ever-diminishing ability to DO WORK (the capacity to induce 'actioned change' … to produce utility of INTERACTIONS; that is: transforms over time … Less or More … possible). This is a distinctly Different Rendition of "possibilities": DYNAMIC AVAILABLE OPTIONS (sequentially related) -versus- the characterization he presented … of mixed happenstances … with no importance placed on the mandated causalities involved. His version here allows any sequence, or any one state randomly generating any of the thousands of others. Behavior PATTERNS were the Entropy Qualia identified during the Industrial Revolution … NOT indifferent comparisons of states, as presented here!!! **** YES … the math notions presented here are okay. They are clinically 'correct'. It's just that those math correlations totally mislead on the nature of PROCESSES … generating 'unrecoverable' Order … and the continual lessening of abilities to diffuse PRODUCTION OF UTILITY functions. **** This video … and any similar ones … are dangerously misleading!! *** His math is correct, only INCOMPLETE … and depicts the wrong properties of GRADIENTED state changes, which have to do with WORK (per 1700s and later concerns) and -now-, in the Information Age … with COMMUNICATION and induced BEHAVIORS CORRELATIONS (as more important than the thermodynamic!!!). IG visitor (Follower) … there to here…: jnrose2

    • @MarbleScience 4 months ago +1

      Maybe "Order" (whatever that is) is not that unrecoverable after all. How else would the world have ever gotten into the state it currently is in? If entropy was a one way road, how do you explain the existence of low entropy states?

    • @jnrose2 4 months ago

      @@MarbleScience Perfect question/notion! *** Entropy (as traditionally modeled) is -not- an absolute monolithic 'tendency' towards 'disorder'. [E.g., the intangible 'energy dissipation' gradient is always identified in small-sample circumstances ~ observations … -then- spoken of as some 'absolute universal running down' … a diffused, ineffective … dispersion of energies through n-dimensional Space (… spacetime).] *** Obviously, different real states & architectures of "forms ~ functions" regularly … at all known levels of Complications … -do- 'recover Order' spontaneously. Although it is not a 'recovered' duplication of immediately prior states (which - if a process stream can -be- reversed - needs added -external- energies to do the task). ** Ordered new energy accretion groups are generated by a NON-statistical mechanism (the same process, with different enacting forms ... at all levels of Complexity production), Operational Function Potential … in the universe. *** Micro-state comparisons are NOT the "mechanism" involved (even though the statistics math seems to be some kind of 'marker' -for- the mechanism). *** ALSO, the conventional fallback analysis to justify Complexity and Emergence … is written everywhere as if it must be some FOURTH LAW of thermodynamics. [That is a typical logic deduction … but … looking deeper into the relations involved … it is Wrong. No "4th law" is required (!). And especially … no statistical properties will describe Negentropic order generation.] The explanation for Local Regional (conditional) "Complex Order generation" … has to do with several co-involved COMMUNICATION FACTORS between clusters of agents present in an 'events set' [many different architectures are possible; shared 'interaction properties' are what are universal despite architectural 'differences']. **** Thermodynamics is only ONE FORM of the involved relations & properties (which is why we see it so often and why we focus on entropy as a trade-off of energy states). The reality is … energies are only a subset of more Ubiquitous … Communications aspects. E.g., fields of force are -not- defined as 'energies', but they interact and produce both entropic -and- negentropic outcomes (!!!!!!). *** *** Your conventional depiction of the statistical factors involved … is not Wrong. It is a sidebar situation-monitoring math. It is not a process-explanation math. That is a big distinction. -- I hope you understand. -- jnrose2 (IG)

    • @MarbleScience 4 months ago +1

      "Your conventional depiction of statistical factors involved … is not Wrong. It is a sidebar situation monitoring math. It is not a process explanation math."
      I kind of agree that on a fundamental level the statistics is not the reason why something happens. Let’s take a classical lottery with balls in a rotating container as an example. Of course the actual reason for the numbers we draw is in the physical interactions of these balls colliding with each other. However, this more exact physical approach quickly gets too complicated for us to follow in detail. That's why we come up with simplified models, and these simplified models are heavily governed by statistics. E.g. in the case of the lottery we typically ignore all the exact physics and simply assume that it is equally likely for every numbered ball to be drawn.
      “The explanation for Local Regional (conditional) “Complex Order generation” … has to do with several co-involved COMMUNICATION FACTORS between clusters of agents present in an ‘events set’ ”
      Honestly, this doesn't really tell me anything. To me it seems like there is a much simpler answer. Processes with a negative change in entropy have a small but non-zero chance of happening. The chance for all the matter required for the big bang to randomly come together is (for us) unimaginably small, but it is not zero! That means, if we wait for an infinite amount of time, it is actually guaranteed to happen at some point. Also, if the Universe is infinitely big, it is guaranteed to happen in some part of it.
      I think all the matter coming together randomly for something like a big bang only seems unlikely on the time and size scales that we are used to thinking about.
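
      As a toy illustration of that non-zero chance (assuming the nine-marble, eighteen-site box from the video), the "all marbles on the left" fluctuation has probability 1/C(18,9), about 2e-5, under random shuffling; rare, but it shows up roughly 20 times in a million random trials:

          import random
          from math import comb

          SITES, MARBLES, TRIALS = 18, 9, 1_000_000

          hits = 0
          for _ in range(TRIALS):
              occupied = random.sample(range(SITES), MARBLES)  # one random microstate
              if all(site < 9 for site in occupied):  # all marbles on the left half
                  hits += 1

          print(f"observed: {hits / TRIALS:.1e}")             # fluctuates around 2e-5
          print(f"expected: {1 / comb(SITES, MARBLES):.1e}")  # 1/48620, about 2.1e-5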

    • @jnrose2 4 months ago

      Fine comments. Thank you. [Please bear with me - I don't know how to copy-paste YouTube comments.] I am glad you recognize the distinction between equation types - most folks don't assess which equations do what. *** I want to follow up on your paragraph that starts "Honestly…". Your "simpler answer" again falls into the "non-explanation" type of equation. Similar to Eddington, who also intoned that if there is a non-zero probability, that suffices to account for negentropic complexity generation, simply on the criterion that 'if it is "allowed", then when complexity happens, it is justified'. Such a monitoring assessment does 'simply identify' that complexity (Negentropy) "can" happen, not "how" it happens (per a universal performance operation … at every level of complication). **** Your context circumstance … the lifetime of the universe … does not map to the ongoing regularity of vast numbers of generated Complexities forming constantly … in all living systems, AND, cosmically … stars forming, molecules forming, solar systems forming, galaxies forming. These are all versions of Easily & Ubiquitously formed … negentropic Complexities. (!) *** The statistics "allowable" description does not cover the huge number of actual productions. ** As far as my reference to factors of communication being involved, I do include a probability factor in that model. The probability of messaging between 2 (or more) entities … in an active negentropic forming … requires that possible information or energy transfers-interactions MUST be "c > 0" in order to maintain linkage, coupling, complexity. All involved communications must sustain a "greater than zero" probability. 😮😲😃👍 ******* I hope that makes better sense to you. 🙏 -- jnrose2

    • @MarbleScience 4 months ago +1

      “Your “simpler answer” again falls into the “non explanation” type of equation.”
      You are correct, but the more fundamental physical explanation would require us to know the exact state of gazillions of photons, electrons, atoms, etc., which is just impossible. The best we can do is a statistical assessment of the situation.
      “Your context circumstance … the life time of the universe - does not map to the ongoing regularity of vast numbers of generated Complexities forming constantly … in all living systems”
      I think there are two types of entropy reduction that we should differentiate:
      1. Local reduction of entropy that is linked to a greater increase in entropy at a different location.
      2. Overall entropy reduction without a linked increase at some other location.
      If I understand you correctly you are talking about the first kind. E.g. If I eat something, I might grow some muscle tissue (which might be a reduction in entropy), but at the same time the entropy is increased by breaking up the carbohydrates I’m eating, and distributing most of the carbon atoms as CO2 in the environment. Even if there might be a local entropy decrease, overall entropy is increasing.
      This kind of local entropy reduction at the expense of a greater increase elsewhere, is not unlikely at all, and indeed it happens all the time. These are processes with an overall increase in entropy after all. However, they can not explain why the world would ever not be in its maximum state of entropy in the first place. At some point a process with an overall negative change in entropy must have occurred.