We need to make a distinction between a theory and a framework. Wolfram is creating a new framework, not a new theory. But you can express different theories using a framework.
Physicists are working on this. It's called "constructor theory." There are simple rules that describe time, space, and objects, which must follow rules, a.k.a. constructors. The system then evolves according to the constructor rules.
Sabine didn't mention something that I thought was most curious about Wolfram and Gorard's work, which was the way they worked with hypergraph rewriting methods. Firstly, by choosing hypergraphs, they were allowing all possible topologies, and secondly, rather than choosing some specific set of rewriting rules that they thought might work, they chose instead to integrate across all possible hypergraph rewriting rules. So they're not imposing any topological structure and they're not imposing any structure on change over time, and yet structure emerges nevertheless, which is astounding. Many rewrite rules are computationally equivalent - so they reduce to their simplest form for predictive purposes. Many rewrite rules produce no structure at all, and so they can be ignored as background noise. Many rewrite rules produce only momentary structure that self-destructs - think of virtual particle pairs that emerge and self-cancel. Many rewrite rules produce structure that is computationally reducible - and hence the kind of structure we focus on in macroscopic physics, where we can predict the outcome because it can be computed faster than the system that actually enacts it. Many rewrite rules produce structure that is computationally irreducible - like we see in quantum physics, where there is a probabilistic distribution of potential outcomes that could be determined through Feynman's path-integral (sum over paths) approach, but which could never be computed faster than the system itself operates. Essentially, though, the structure we observe is the recurring patterns that emerge from the rewriting rules that actually do produce recurring structure, while everything else naturally falls away.
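(For readers who want to see the mechanics: below is a minimal sketch of one hypergraph rewrite step in Python. The representation and the single rule are hypothetical illustrations of the general idea, not rules from Wolfram and Gorard's actual enumeration; "summing over all rules" would mean enumerating every such rule rather than fixing one.)

```python
# Hypothetical illustration of hypergraph rewriting: a hypergraph is a
# list of hyperedges, each hyperedge a tuple of node ids. The toy rule
# {{x, y}} -> {{x, y}, {y, z}} (z fresh) is applied to every binary edge.

def rewrite_step(edges, next_id):
    """Apply the toy rule once to every matching edge; return the new
    hypergraph and the next unused node id."""
    new_edges = []
    for edge in edges:
        new_edges.append(edge)
        if len(edge) == 2:                  # the rule matches any binary edge
            _, y = edge
            new_edges.append((y, next_id))  # grow a fresh node off y
            next_id += 1
    return new_edges, next_id

graph, fresh = [(0, 1)], 2
for _ in range(3):
    graph, fresh = rewrite_step(graph, fresh)
print(graph)  # structure grows step by step from a single edge
```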
Absolutely. I made a comment here on this as well, pretty much (studied the W-Model for 4 years, seems like you have as well). This video basically reads like "let's skip mass-energy equivalence and cut straight to N = 4 super Yang-Mills" reporting on the W-Model, which is the same mistake pretty much every communicator makes on the project. Computational equivalence, computational irreducibility, the Ruliad... these are super important bedrock ideas that underlie the physics model, which has the nature you are talking about here in your OP. Sabine starts talking about causal sets... which is like a very small portion of the W-Model's way of (rigorously) formalizing the topology into a continuous spacetime manifold... and this conversation happened 4 years ago (look up "Wolfram Physics Project: A discussion with Fay Dowker"). The most important aspect of the W-model is the idea of convergent physics that you are highlighting in the OP: that once we say there is this causal set of relations (causal graphs) and finite observers embedded in this graph that try to understand what's happening, then what follows is quantum mechanics, relativity, and statistical mechanics. These three bodies of physics (and the hypergraph) follow inevitably as a product of just existing in this Ruliad object. In other words, in a structure that is "running all rules" there is convergence to the same laws of physics, therefore no need to "find the one rule", which was a weird proposition anyway. I don't know why science communication is so hard. But anyway, if Sabine is supporting Wolfram, I'm down for that; I just hope she gets the bigger understanding eventually so she can do it proper justice.
^^ Agreed! In my own metaphysical musings, I've kept coming back to the idea that topology is fundamental. Abstract rewriting basically covers how one thing can become, or be causally related to, another thing. The topology of a set of things is like its structure, something observable. And how the topology changes according to rewrite rules is akin to how we observe a dynamic universe where these observable phenomena change over time. I believe taking an unbiased approach of summing over all possible rewrite rules can be compatible with the notions of relativity and entropy/randomness. It seems to make sense that the universe is trying all possible transitions from one configuration to the next, but that some transitions negate each other, leading to symmetries of the universe, such as the principle of stationary action. It's like within the space of all possibilities of the universe, there is a manifold of possibilities that follow these confluent laws of nature. And a relativistic frame of reference is like a projection of that higher-dimensional manifold onto a 1D submanifold, being that observer's local perspective of causality. And finally, the random nature of the sum of rewriting rules leads to increasing entropy and the perceived direction of that local causal submanifold.
@@NightmareCourtPictures if this works, would it explain the "fine tuning"? If you start from nothing and physics still falls out, then that implies that this is the only way things could have been, right?
@@samuelwaller4924 Yep that's right. I believe fine tuning is well explained under this model. There's no need for Multiverses or carefully constructed initial conditions.
It's refreshing to see that Wolfram actually listened to the criticism of flaws in his theory and found someone to help address them. Good to see that Gorard used the work from the causal-sets people instead of reinventing it. Even if the theory/direction is shown to just not work, at least we would know what not to work on, which would already make it better than string "theory".
I wonder if the progress in analog (computer) chips made or published around 2022 brings any comprehension advantage, but I'm not an expert on that "topic".
Refreshing? Are things so bad in science that it's no longer standard practice? Isn't it still the point of peer review to find flaws for the author to address, or alternatively to confirm or supplement the work in case it's found to be valid?
@@1112viggo According to some of Sabine's other vids, the point of peer review is to have professors and researchers check each other's work for free so that the publishers can make money. The work is required of university professors, so it becomes more of a quota to meet than a way to progress scientific knowledge, and therefore is just busywork. Peer review does find errors, but most published articles are just rehashings of the same discoveries, so there aren't any mistakes to begin with because of the replicated work.
@@user-fk8zw5js2p lol right, I saw that episode as well. But I'd like to think that the original intention of peer review still serves its purpose, albeit with a modern twist of exploitative bureaucracy😅
Their current status seems to be that they can explain superposition and decoherence in quantum physics, and they can derive general and special relativity from the starting point of hypergraphs. This is already amazing progress, as it hints at unifying the two theories. The big unsolved issue ATM is accounting for particles. At the level of the hypergraph, particles are probably vast self-repeating patterns in the graph, but it's been difficult to work out how they work, so we're very far from deriving the standard model.
"Wolfram is a curious case because he's not your average crank." - in his teenage years at Caltech, he was publishing research papers in theoretical elementary particle physics that were world class. He was prepped to become the next great theorist and then he changed directions. He's not really a mathematician, he's always been a theoretical physicist at heart and his mathematical approach was a way to get at the patterns deep in nature. It has been a many decades long project and even though I don't think it will be able to derive physics from these supposedly deeper principles, it is an astoundingly original and interesting effort - from a genius.
Physicist here. I had dinner with Wolfram recently and he is definitely not a crank. He has a deep curiosity to learn all he can about areas of physics that are relevant to his program but are new to him. He is humble in style yet confident that he is on to something. We need people like him who have original ideas and the skill, time, and money to pursue them (he became wealthy from his invention of Mathematica, a software program that is now used by thousands if not millions of scientists every day).
@@squeakeththewheel When I was in high school preparing to become a physics major in college, Wolfram won the MacArthur prize and a small article in the paper about it brought him into my consciousness. My plan was that I would be his student at Caltech - I couldn't wait. Unfortunately, he moved on from Caltech before I graduated from high school.
Give me the names of the "world class" papers he published as a teenager. I've seen this claim repeated everywhere, yet no one seems to know what those supposed papers are.
@@sereysothe.a By 1981, he had published 25 papers in theoretical and computational physics, including:
1977 (at age 18) - a paper on heavy quark production in QCD
1978 - early connections between cosmology and particle physics
1978 - the Fox-Wolfram variables for analysis of event shapes in particle physics
1979 - the Politzer-Wolfram upper bound on the masses of quarks in the Standard Model
1979-1982 - a computational QCD approach to the simulation of particle events
I'm obsessed with this channel. Not only its content but also the presence of Sabine Hossenfelder. She gives so much structure to physics. Thank you from Portugal.
"The story so far: In the beginning the Universe was created. This has made a lot of people very angry and been widely regarded as a bad move." - Douglas Adams
Waiting for someone to figure it all out so it can be immediately replaced with something even more bizarre (but might make more sense to us ape descendants...)
I did stats classes in college and there were two professors I really enjoyed. They had different approaches to fitting a model. One was very fond of running Monte Carlo simulations and working out a model from the observed data. The other liked to work out the math for the model they theorized, then test it afterwards. A machine gun vs a sniper rifle. Either approach would probably work, and it was more appropriate to use one method over the other for certain problems. I feel like Wolfram is simply taking a different approach from a 'traditional' physicist because the bulk of his life's work wasn't in traditional physics. Perhaps physicists had a hammer in their hand and saw the problem as a nail, and Wolfram has a screwdriver in his hand and sees the problem as a screw. Time will tell which approach bears more fruit.
I asked under another comment but I will ask here also, since I see so many links in people's comments: How did you add it? Why did your comment not get auto-deleted?
Technically, with Monte Carlo simulations you’re not running the simulation first and then figuring out the model afterward from the data that was produced. Rather, you have to decide on the model _beforehand_ (i.e., the probability distributions involved, their parameters, and the logic and math of how their samples interact), and then you produce simulated data from that pre-defined model.
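(A toy sketch of that workflow in Python, with a made-up model purely for illustration: the distributions and their interaction are fixed up front, and only then is simulated data produced and summarized.)

```python
import random

random.seed(0)  # reproducible illustration

def simulate_once():
    # The model is decided beforehand: demand ~ Normal(100, 15),
    # price ~ Uniform(8, 12), revenue = demand * price. All of these
    # choices are made up for the example.
    demand = random.gauss(100, 15)
    price = random.uniform(8, 12)
    return demand * price

# Only after the model is fixed do we generate data and estimate from it.
samples = [simulate_once() for _ in range(100_000)]
print(sum(samples) / len(samples))  # Monte Carlo estimate of mean revenue
```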
@@anonanon6596 People don't add those links. It's a new "feature" of YouTube which automatically adds them to keywords matching other YouTubers' videos.
I also love Jonathan's attitude towards it. Even if the computational approach doesn't quite work, it may still produce mathematics that reveals new ideas, for example the fact that curvature can also be a measure of non-integer Hausdorff dimensionality. Stuff like that. The way they are incorporating quantum mechanics in their models might actually help teach us a lot! More power to them.
@@DrummerRF … but the problem Wolfram has is that his model is so radical that it doesn't really square with any extant models in use. So to date he hasn't even come close to providing anything even as simple (in conventional frameworks) as the g-factor calculation. In fact, he's made it clear that that would be a considerably lofty goal.
One of the mistakes I see critics of Wolfram's approach make is assuming that the computation steps correspond to increments of time. They don't, and this was made clear even in Wolfram's 2002 book (A New Kind of Science). The other mistake is failing to understand that observations from "outside" the hypergraph are not possible; the observer is itself a substructure of the hypergraph, and this affects what can and cannot be observed.
Exactly this. Slice through the hypergraph at one angle, and the links you cut define space. Slice through it at a different angle, and the links you cut define time. Rotate these slices, and you get space and time switching just like relativity says. I'm not sure why it's billed as "incompatible with GR", since the best and most reliably confirmed theory we have (QFT) is also incompatible with GR. We wouldn't be looking for "a theory of everything" if GR and QFT were already compatible with each other.
"Wolfram's approach make is assuming that the computation steps correspond to increments of time." Not neccesary. issue is that people apply time because of the way computers are dependent on clock cycles. Think of it requiring a transform (ie Z/Fourier Transform: Time Domain to Frequency Domain) to change it from the digital compute domain to the real world domain. We don't have the means to compute the real world in real time with digital computers.
@@antoniopannuti2088 It would be fairer to say it's the other way around, since this is the older comment. @RaeneeCarver-i3x's comment is identical to this one.
I appreciate how Wolfram's approach tries to uncover underlying patterns to explain complex phenomena. It's compelling to consider that what we observe as the laws of physics might just be the surface of a deeper, systematic structure that governs everything.
As I understand it (and IMHO this is the main reason physicists should have a look at his work), part of Wolfram's genius in his approach is to remove the observer from the analysis and to study the hypergraph as a whole. This allows us to sidestep our observer bias when considering motion and other things (since as observers we are part of the graph). His thesis is that the laws of physics are emergent properties of the hypergraph AND that we can only experience them as observers from within the graph. So we can experience the emergent effects, but can't experience the underlying principles by nature. That's a level of abstraction I initially didn't think we would need, but it makes sense given how complex systems behave.
I am a big fan of his observer theory as well. It is very intuitive given that all observers are made of topological solitons of spacetime and are intrinsically connected to the universe around them.
Why is removing the observer a positive step? The observer is, and always was, part of the system. It is not just there to annoy physicists, but a part of the system under consideration.
I wonder which things will become computable once something meaningful like (don't laugh) a Mathematica for Quantum ships. I am not interested in how it can crack RSA. I am curious about the currently incomputable things. Some are time-related.
I was a student at the University of Illinois getting a degree in Engineering Physics and took Wolfram's class on Mathematica. We ran it on Sun Microsystems computers and were essentially helping to debug the first release. I used it to plot electron orbitals in 3 dimensions and loved my time in that class. Many of the people who worked for him were also brilliant. Theodore Gray ran that class IIRC and has done some really interesting stuff himself.
@@Olgasys Quantum computability is equivalent to classical computability though. You mean problems that are efficiently solvable by quantum computers?
@@quintium1 Absolutely. A lot of things are labelled impossible to calculate since they would take millions of years. There is another thing like "If you can perfectly predict next month's weather but it takes two months, it is pointless"
I've been fascinated with Wolfram since the release of his much-lambasted A New Kind of Science decades ago, and it's only your videos and discussions of the academic world that have helped me understand his situation. Wolfram is famous for his ego and his radical pronouncements ("A new kind of science!!!1"), but I realize now, what choice does he even have to be heard in a world full of people with their fingers in their ears? His work challenges the academic consensus, and so everyone just laughs at him. I've got the utmost respect for him continuing this work for so long now despite the hostilities. The world needs more original thought.
Over 13 years ago I read Stephen Wolfram's book A New Kind of Science. The book blew me away, from my fascination with Mandelbrot's equations to the principle of computational equivalence and simulating systems with emergence on a Turing machine.
I studied the model for 4 years (since 2020). Those who know me have seen me around, recommending people look into it and explaining Wolfram's work formally. I'm gonna explain way more clearly what the W-model is.

Wolfram's work in A New Kind of Science was running computational experiments, specifically proofs by exhaustion (running entire rule classes), to show the three following facts:
1) That rules can produce arbitrarily complicated behavior, specifically behavior NOT DESCRIBABLE by mathematics.
2) That rules can be generalized as falling into 4 classes of behavior: homogeneous, patterned, random, and complex.
3) That rules can emulate each other's behavior, for example rule 22 emulating rule 90 with a different initial condition.

The 3rd observation is the most important one, because in the latter half of the book he uses this ability to emulate the behavior of other systems to make a proof of his Principle of Computational Equivalence: that all rules in some sense sit in the same rule space. He then goes big-brain and shows that the size of that rule space is equivalent to that of a Turing-universal machine. So, by proving the universality of rule 110, and showing many examples of how these CAs emulate each other, the idea was that you could string together rule emulations to emulate rule 110 and thereby emulate a Turing-universal machine.

The reason the Principle of Computational Equivalence is important is that it is an equivalence statement for ALL systems: any systems that follow rules are equivalent to each other, by being classified as Turing-universal machines. This is also what separates it from other similar ideas in the space like the Church-Turing Thesis, which says all systems can be emulated by a Turing machine... the Principle of Computational Equivalence is a statement that all systems ARE Turing machines. Clear difference.

Following the Principle of Computational Equivalence is the formal argument of computational irreducibility: because all systems can be considered equivalent to Turing-universal machines, every system is computationally irreducible. Finding out what they do is equivalent to trying to solve the halting problem. Again, this is a clearly novel and distinct statement, one that is stronger than superdeterminism.

The above leads to the construction of Wolfram's Ruliad: all systems "sit" inside this single complexity space and are equivalent to each other, to this space of a Turing-universal machine. This equivalence is formal: a graph isomorphism, and if you are a system embedded in this graph, you must preserve the structure. For example, if I'm sitting in one corner of a room looking at an apple and you are sitting in another corner seeing the same apple, the room is still the same room; we just see the apple from different perspectives and therefore see different things, but it is still the same underlying structure. Generalize this to spacetime, where one reference frame and another can see different perspectives of spacetime events, but it is still the same underlying structure; hence relativity. The Ruliad is the mathematical object of this fully computed Turing machine, and THIS is the object we preserve group transformations to. That space is far more complex, and "bigger" if you will, because it contains all possible events...
future, past, everything... so in other words time doesn't exist; we just perceive time with respect to this Ruliad object. Consequently, the entire model comes from this aspect alone: the Ruliad eternally exists, and our perception of the Ruliad is what gives us spacetime, QM, and statistical mechanics, as limits of our perception of this infinite object as finite systems ourselves. This account of how we perceive that object is the W-model. There's so much more to this model, and if you take enough time, you'll realize it is beautiful. Saying that systems are Turing machines comes with a lot of elegant consequences, such as the fact that systems can be thought of as computers in the same way we use computers now: the universe is an ocean of unlimited potential and we are "sampling" it to get what we need, for things that are useful to us... like a kind of programming.

If interested in his work, watch videos in this order:
New Kind of Science Series (1-16)
How Universal Is the Concept of Numbers
Can AI Solve Science
Observer Theory
Stephen Wolfram Readings: What's Really Going On in Machine Learning? Some Minimal Models

Also read The Concept of the Ruliad on his blog.
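(For anyone who wants to see the rule classes mentioned above for themselves, here's a minimal elementary cellular automaton in Python, using the standard Wolfram rule-number encoding; try rule 30 for the random class and rule 110 for the complex, Turing-universal one.)

```python
# Minimal elementary cellular automaton, enough to reproduce the behavior
# classes discussed above. Standard encoding: bit k of the rule number
# gives the next state for the neighborhood whose binary value is k.

def step(cells, rule):
    """One synchronous update of a 1D binary CA with wraparound edges."""
    n = len(cells)
    return [
        (rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

cells = [0] * 31 + [1] + [0] * 31  # a single live cell in the middle
for _ in range(30):
    print("".join(".#"[c] for c in cells))
    cells = step(cells, rule=30)   # rule 30: random-looking growth
```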
@@NightmareCourtPictures Man, I find Wolfram to be worthwhile and interesting, and I was also an early interested party in his physics model, but you really want to take anything he claims with a big pinch of salt. He CONSTANTLY restates old principles and sometimes seems to just rename them and claim them as his own… The "Principle of Computational Equivalence" (or whatever he's calling it) is a great example of this behavior, it seems to me. Turing and Church's ideas for general computation seemed enough; what is Wolfram actually adding here!? Actually, his whole book ANKS was interesting, but again, it's not any different from the field of non-linear analysis, which preceded him by DECADES. I wish Wolfram wouldn't do this. As I said, I find his talks and writing interesting enough on their own merit, and I don't see any value he gives himself by such grandiose claims. This is all just my opinion; I could be wrong. But I have a strong suspicion that I'm not in Wolfram's case.
If rules are the foundational building blocks, and by rules I assume we mean anything that can be described / axiomatically stated, how does this approach avoid the limits on describing reality discovered through Gödel's theorem (and derivatives)? The theorem is basically interpreted as saying there are things obviously real and true, part of reality, that can never be computationally proven, i.e. deduced from an axiomatic approach. I expect this has been considered by W, but neither this video nor your explanation seems to address it, except mentioning the halting problem in passing and not saying much about it. P.S. It was quite some time ago that I left my uni comp-sci math classes behind.
@@chriscurry2496 I explained the difference between the Church-Turing Thesis and Computational Equivalence: saying that all functions (which we can think of as an apple) can be calculated on a Turing machine is not the same as saying that all systems, including apples, ARE Turing machines. It was not at all obvious or apparent that the universe could be defined as being like a computer program, where systems like apples could be defined as programs... not until recent times (in the age of Wolfram). Other people around the time of Wolfram had ideas about the universe as being like a computer, but in general they all missed the mark, trying to use mathematics to describe it... which leads to the point made below: NKS is not like non-linear analysis. One of the big points of the book was to show that mathematics like statistical analysis is not robust enough to describe systems like the cellular automata; hence the first observation: "Many rules were not describable by mathematics." "Rule 30 does what rule 30 does" is the only way to properly describe rule 30 and what it does. NKS covers this, and so too does the lecture that should follow watching that series, "How Universal Is the Concept of Numbers", where he focuses on how numbers fail to describe systems, but how it is ultimately this feature "of counting things" that forms physics as we perceive it. Equations are constructs made from the limitations of our perception of this infinite, uncomputable thing that is the universe... again, that is novel.
@@NightmareCourtPictures Man, I'm sorry, a lot of things you say here are just flat-out false. The points about Turing/Church are not even correct. Read what Turing thought, for instance: he was very much considering things in the universe to be computational. Your point about cellular automata "not being representable by mathematics" is just ignorant. I'm sorry that I am so blunt, but that's the truth. I mean, cellular automata ARE mathematical (see Peano Arithmetic for a simple example)! The entire theory of computation (under which CAs fall) is mathematical! Whenever you see a program, that is equivalent to a theory in mathematics, except that it's somewhat limited. In fact, the field of mathematics includes constructs which are MUCH more powerful than those in computational theory (see the real numbers, for instance).
Great explanation of how computational, incremental theories like Wolfram's are, like quantum mechanics, incompatible with general relativity. It reminds me of a notion I had: that Zeno's paradox can be explained by the idea that nature is discontinuous, made up of discrete elements of space and time.
This sort of interdisciplinary work should be, with some fair scepticism, strongly encouraged. It may lead nowhere, but the process of an ideas guy joining dots from different specialisations so that others shoot them down, before they are then refined, is healthy scientific inquiry. It forms a sort of search tree and opens up new approaches. Dogma and gatekeeping, eye rolling about heterodoxy, shouldn't be the reason we dismiss these paths.
Gatekeeping is always a good thing. In fact, I don't know how you missed that he's being gatekept here. Gatekeeping is making him come towards something that makes more sense.
@@TessaBain Perhaps because that word can mean different things. Knowledge doesn't respect campus layouts. If people from one discipline have the institutional power to decide the future of ideas they aren't in a position to fully appreciate, that can be a problem. Some physicists make the effort to provide constructive criticism, but others, as Sabine suggested, should take Wolfram more seriously. Perhaps if they had sooner we'd be further along.
@@marca9955 Perhaps, but we can't act as if we had an infinite amount of time and other resources. One takes seriously what seems promising and needs the support or help one can provide. One can be wrong, of course. But Wolfram doesn't really lack resources, and other physicists have a right not to find what he does interesting, without wasting their time criticizing it.
You misunderstand at 2:45. His theory doesn't have time, only a sequence of events. Time is an emergent phenomenon that is used to make approximate models, but it likely makes it harder to understand the fundamental rules.
As Sabine put it here in one comment correctly, "there are two types of time, the one in which you calculate the hypergraph and the type that the graph represents"
@@javiersoto5223 no, you'd need to have an (emergent) total ordering on events from the model for it to have time that behaves 'like time'. that's a further restriction that his more general models don't necessarily need to follow
Sabine, I've admired your skepticism of junk science so much, and so I was pleased to see you're also encouraged by Wolfram. I think he's onto something, but you're right his work is so hard to understand. This was a great video. Please never stop doing them!
I’m curious, what’s hard to understand? He’s just saying that what we perceive as time and space are the effects of multithreaded computation constantly rewriting a hypergraph.
@@GeoffPlitt Well yes, you can follow the math. It's Conway's Game of Life on steroids. Trying to visualize emergence is where it becomes difficult, but we've all seen how simple computational rules produce complex emergent behavior. That behavior is what he means by computational reducibility arising in otherwise irreducible computational spaces.
@@GeoffPlitt Ad hominems begin where wisdom ends. As a computer programmer for over 30 years, I see this differently. The math is quite simple. If you're having trouble, it's likely because you need to learn about emergent behavior in complex configurations of otherwise elementary computations. Start by trying to learn how a simple calculator works at the mathematical level. Then move onwards to how a computer works. Can you explain how a piece of silicon can produce the ability to sit here and have this discussion while simultaneously allowing us to view Sabine's video? If you cannot do that math, then you're going to have a lot of trouble following Wolfram's computational model because what he's saying is that all our physical laws arise from computation of very simple rules.
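(Since the Game of Life keeps coming up in this thread: a minimal sketch of its update rule in Python, just to make the "simple rules, complex emergence" point concrete. The glider at the end is the classic five-cell pattern; the implementation choices here are mine.)

```python
from collections import Counter

# Conway's Game of Life in a few lines: live cells are a set of (x, y).
# From these rules alone, gliders, oscillators, and even Turing-complete
# machinery emerge.

def life_step(live):
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for x, y in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):
    glider = life_step(glider)
print(sorted(glider))  # same glider shape, shifted one cell diagonally
```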
"There is a theory which states that if ever anybody discovers exactly what the Universe is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable. There is another theory which states that this has already happened." - Douglas Adams My theory is that it has happened more than once.
This is a great explanation of Wolfram's theory, thanks Sabine! I've been trying to read and understand it for quite some time, but your 12-minute video definitely put it in the right perspective.
Wolfram recently did an entire series on his youtube channel where he goes through every chapter in his New Kind of Science book and just talks about the ideas in each chapter. It's a wonderful set of videos, and you get to hear the interpretation of the ideas in the chapters straight from the source.
One nice thing about this is that it brings a different approach to these questions than most scientists bring, which will open up new clarity in explanation and most likely new learning for science too.
My dream is to see you discuss this directly with Wolfram in a podcast or something. I think it would be incredibly productive. Please please please pleaaaaaase :)
Actually there is an answer; the energy is what we create, light is made when we make our ATOMS interact so that they may then produce PROTONS. All things are vibratory and there is a right answer for everything.
As an undergraduate, one of my lecturers at university was Fay Dowker, who was a student of Sorkin and works on Causal Set Theory. She's an awesome lecturer and gave us an outline of the theory in one of her lectures. I remember Wolfram's theory coming out around that time and thinking "this sounds a lot like Causal Sets…". Interesting that he and his research team have taken the ideas of Causal Sets on board.
Ok, here's one of mine. Enjoy... What if Dark Matter and Dark Energy are rivers, pockets and/or bubbles of slower or faster moving time? We already know time can be manipulated. Imagine seeing an electron orbit however many times a minute. We set that as our base reading, right? For our plane of existence, anyway. In these pockets of Dark Energy or "Light Time," that same electron now has readings of a billion times a minute, because time is "light" here and passes a billion times faster. Giving us readings of energy where there shouldn't be energy = dark energy = Light Time. Dark energy IS "Light Time." It's the difference in TIME that we're seeing repelling everything around it. Now we look at the other side of the coin: Dark Matter or "Heavy Time." "Heavy Time" would be slower rivers, bubbles and/or pockets of time. That same base electron now moves a billion times slower and appears to stand still. "Starving of time," or slower pockets in time, creates a vacuum or attraction in space, making it appear that dark matter is attracting when really it's just a vacuum, IN TIME, trying to equalize. If I'm right, a white hole's existence flashes by in an instant and a black hole's existence lasts forever.
@0:22 I thought Stephen Wolfram was a theoretical physicist who became a computer scientist later on when he developed Mathematica. According to Britannica, he got his PhD in theoretical physics at the age of 20 from CalTech.
@@lluvik2450 I think it is because the topic is theoretical physics and Sabine is a theoretical physicist. When she says he is a computer scientist and a mathematician (which he has no degree in either, by the way), it gives the listener the impression that Stephen Wolfram is working on something that is not within his field of study, and thus diminishes his credibility. This is like saying Albert Einstein is giving a lecture to a bunch of biochemists about his theory of how life was formed on Earth.
I join Wolfram’s weekly streams as much as I can. I love that he engages with viewers and answers questions. He’s without a doubt one of the most intelligent individuals I have ever known.
Are you aware that to be a mathematician you must create new mathematical results? Your claim is like saying that the queen of England is the greatest marketer in human history.
One thing I like about Wolfram is his wealth of knowledge about the history of physics and maths and where physics has come from. Apart from having a PhD in theoretical physics from a young age, I'd argue he has a better understanding of the context and history of where ideas in physics come from than most physicists.
IDK why this is (I have attention deficit disorder), but learning the 'historical framing' of discoveries is, like, key for me. It's like with the story of Eureka! and Archimedes: the story itself is the 'meme' in my gray matter that I can utilize in speech and thought. The discovery is the science at the inside of that meme. OR SOMETHING IM JUST SPECULATING. Wolfram's historical encyclopedic knowledge of physics has been tremendous, for me. And it's all free. What a man.
Nah.. there's nothing worse than a pure mathematician poking into physics! They always mess things up! They always made abstractions that had no grounding in the real world. Sure, they can "polish" a theory.. but no pure mathematician ever came up with an original idea in physics. Knowing history helps nothing here... And, yes, before getting into a debate: Wolfram is far from a physicist no matter what he claims! His whole body of work is nothing more than mathematics.
He's learned a lot more doing his work for the majority of his life than he learned in grad school! So if anything he's a computer scientist and businessman!
Wolfram was the only one crazy enough to actually break through this calcified field. And Gorard is taking it to the next level. Gogogo! We are rooting for you!
Thanks for the wonderful video Sabine. I worked on Causal Sets for three years on black hole entropy. I worked on string theory for a year. In my personal view, causal sets is the simplest yet most elegant theory of quantum gravity. It is unfair how science funding is not distributed equally among different approaches to quantum gravity. Many young physicists are forced to choose a different career trajectory because of this. I really appreciate you bringing causal sets to your platform for people to notice. It is through joint collaboration and the sharing of ideas between different approaches that we can efficiently tackle this century-long problem, not by claiming that 'our theory is the best and everything else sucks'. I think the science community should appreciate ideas like 'causal sets' more.
I don't know much about quantum gravity, but I can comment that *in general* it does not make sense to distribute money equally among various hypotheses to solve a particular problem in any given field. The funding agencies, be they public or private, want to get a return on their investment; more so in private, but certainly also in public funding. This means there has to be scrutiny over which studies get funded, as otherwise money would simply be wasted on a massive scale. Science has always looked for consensus to determine what works, as the absolute vast majority of ideas that would break fundamental aspects of consensus theories are not true and likely also rubbish. Science is conservative, but at the same time ready to change when evidence is produced against an existing theory. If you can convince someone to fund your study, great, go for it, but don't expect society to fund you just because you had an idea. In our society there seems to be this underlying current of thought that science and scientists are in some ivory tower looking down on people "who really know". All the scientists, at least the good ones, are very down-to-Earth people who really want to solve the problems we have in our society. We need to support them, not demonize them.
@@hubbeli1074 But how do you know, when you scrutinize, whether an idea can cause a breakthrough in the long run or not? What we've seen is that public funding basically puts all its bets on one thing, which is determined by an informal popularity contest (public bias).
@@BboyKeny Do you have a study or facts, other than your feeling, that this is the case? My main comment was that "equitably" funding all ideas seems like a really bad idea. I want my tax money to be spent efficiently, and as I am not an expert I can only assume the real experts making the funding decisions are doing the best they can given restrictions on the amount of funding and e.g. political pressures over which they don't have any say. If there is an argument against this stance, then a properly articulated opinion with concrete examples of what should have been funded at the expense of something else should be presented.
Dear Sabine, I very rarely leave comments, but in this case I have to... I loved this video!!! I respect and appreciate Stephen Wolfram's work immensely
I can appreciate the fact that somebody is willing to pursue their ambitions despite mass criticism. Double kudos when those ambitions aim to enhance my understanding of the universe.
A computer program doesn't need to show things in an incremental way. The program can compute the whole state of the universe that will be "shown" in the next step, and then show it. So the computation can still be done incrementally, but the universe would never feel that, because the universe only sees the already-calculated "state". That step might take a very, very long "time" to calculate in the "place where that computer is working", but it can't be felt in our universe. It might even work in such a way that each step involves a horrendous number of loops converging things to the desired values with a set precision (which, I guess, doesn't necessarily need to be fixed; it could be different for each step and each value, because the precision might be estimated based on something).
Take electromagnetic radiation. It exists in waves; it is analog. Currently, in the binary epoch, we use an analog-to-digital converter for processing and managing the wave. The binary product is a representation that mostly works, but it is not the phenomenon itself, which is forever lost in the conversion. So even if we convert using a nearly infinite number of binary steps, the phenomenon being converted is lost. Say we take a grand microscope and look at the elementary components of the model: we would see binary steps, not the analog radiation. The model and the event are substantially different in implication, to a rather large degree that cannot be overcome through more stepping. To me, a continuous phenomenon should be modeled using a continuous system.
@@damianlukowski9996 You could never prove it. If the characters in a video game can never see the code, as they're trapped in the simulation, they can never prove they're in a simulation.
@@Humungojerry Yeah this just is not a sensible conclusion. It may or may not be the case that digital dwellers "in the game" unmask the code which implements them. It's rather analogous to actual physicists unmasking the laws of physics--whether or not we are ultimately successful depends on many things, such as how accessible the laws are, how complex / simple they are, etc. In the case of digital physics, I think the proper analogy would be a virtual machine. It's known that some exploits allow for guests to discover and break out of their virtual machine container, and thereby start interacting with the host system. In short, it depends. It may or may not be the case that such creatures could discover their implementation.
Hmm, his space-as-a-network theory does have a certain charm to it. GR says that space can't have an absolute reference frame/coordinates, but a network doesn't have one either. It's all "relative" by definition and so kind of automatically aligns with relativity. If that's the case, then technically quantum effects happen at the node scale and gravity is simply the arrangement of the nodes together, i.e. their connections. Very intriguing. It's less that reality is a simulation and more that computation itself is an exercise in emergent behavior and properties. And so if we use what we have learned from studying computation, we can see that the network data structure might be a nice fit as the simplest organization of reality that can still lead to the emergent properties we call physics. It's worth investigating, and it's a better reason to do it than the traditional "this would make the math beautiful/elegant" that we've been using for 50 years.
Didn't Gödel prove there is no algorithm that can explain all of mathematics? That would make a theory of everything a null argument. Wouldn't a "theory of everything" be everything?
@@techsuvara God, it's been a hot minute since I studied that, but I believe Gödel's theorems are about completeness/incompleteness. Basically, any system of logic (math included) has to choose between "being able to prove everything" and "making sure everything you prove is actually true". What he proved is that if you have one, you can't have the other. If you can prove everything, it means every true fact can be proven, but it also means you can, using that same system of rules/logic, prove things that aren't true. And on the other side, if your system only proves true things, i.e. every single thing you prove to be true is in fact actually true, then there will be some true things that your system, no matter how you manipulate the rules and logic, will never be able to formally prove. Put another way: if your circle is big enough to have all the true things inside of it, some false things will also be in there. And if you shrink the circle so that all the false things are gone, then some of the true things will also be outside the circle. I.e., you can't have a circle that has all the true things inside and all the false things outside. (This is a gross oversimplification, but the general idea holds.) I do not believe it actually says anything about "computation" itself. It shouldn't affect any computation that describes all of reality. I think the confusion may be conflating the "theory of everything" with the idea of "every single theory". Gödel's theorems talk about "every single theory" but not the theory of everything. There is a weird subtlety to these ideas, i.e. what it means to "prove" something. But some of it is more semantics.
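(For reference, here is the standard modern statement the comment above is paraphrasing, in LaTeX:)

```latex
% Gödel's first incompleteness theorem, roughly as paraphrased above:
% for any consistent, effectively axiomatizable theory $T$ that can
% express basic arithmetic, there is a sentence $G_T$ such that
\[
  T \nvdash G_T \qquad \text{and} \qquad T \nvdash \lnot G_T ,
\]
% i.e. $T$ is incomplete: no such system can be both consistent and
% able to prove every true arithmetic statement.
```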
@@aggies11 a simply way to put it, is that any sound formal system has limits on what it can express. Any formal grammar (such as a language grammar), is also a formal system. From my research, it's related to recursive and self referencing relationships. One interesting thing is, quantum mechanics are expressibly in formal systems too... My intuition is, no computational model can describe our reality. This is because reality exists outside of what we observe externally. That's just me as a panentheist.
@@techsuvara Yeah I think that's basically what I remember. But that's about whether something can be "proved" not necessarily expressed, right? And to compute something you just need to be able to express it? So there are math statements that we can express, but can't actually prove that they are true or not. So as long as we can express the math of a possible "theory of everything", even if we can't prove it's "trueness", it's possible to still compute it? I do remember the halting problem, but I don't think that's related. Undecidability I think it was called? I think that's separate from incompleteness.
I've had difficulties with both Wolfram (since the 2002 book) and Tegmark (I view his "Mathematical Universe" as a close analog of automata theory) in their inductive approach to the logic of theory building. It seems to me that if you: 1. establish your first principles in a way that intends to produce a known result; 2. continually tweak the model when the iterations fail to produce that result; and 3. stop iterating once the desired results are obtained, then all you seem to do is reproduce the previously known results in your "new" language. Thus, even if these inductive processes lead to the answers the researchers are looking for, they have not created anything explanatory or "new."
Actually, this is how a large part of research mathematics proceeds. The results of this method have certainly proved valuable in very many cases -- very many of the modern definitions of mathematics, for example. Furthermore, if the new language is simpler than the old one, I believe the "rephrasing" can indeed claim to have explanatory power; even if it is not, a new perspective can often prove useful.
It all rests on one simple fundamental philosophical assumption: that if we get a model explaining everything we observe, we magically start to know the nature of everything in its entirety. It's like having a polynomial fitting a number of points of a non-analytic function and thinking that we now have a formula describing the function completely. Wrong, wrong. It will be wrong the moment we leave the realm of already-known observations. Such theories should be measured on technical applicability and the predictions they provide. We already have tools that provide us with quite a good description of what we observe.
Computation is only incremental if it's digital in some way... Analog computers have no problem doing continuous computation. Also, QM and GR are already known to be fundamentally incompatible, so we already know that there's something wrong with one or the other.
Or perhaps there is a presently unknown bridge theory between the two that requires another Einstein. The problem not being that either is wrong, but that a piece of the puzzle is missing entirely at this time.
Analog is really fundamentally discrete: you still have bandwidth limitations. What we see as analog is just very zoomed-out discrete transitions, in frequency, bandwidth, voltage, etc.
I really like Sabine's explanation. I would ask him what results his theory can predict that aren't already covered by the other fields, or where it yields different results.
What if we stopped looking for the Truth about the universe and started looking for useful frameworks to predict specific things that happen in the universe? If we did that, maybe efforts like Wolfram's wouldn't seem so threatening to career physicists. Maybe this tool, or some pieces of it, could be useful in some instances? Then maybe we'd all give up the dream of being the "next Einstein" who apprehends the mind of God, and we'd make progress towards predicting and influencing the future, which is mostly all science can do for us anyway.
Stopping looking for one thing does not make you more efficient or successful at looking for a completely unrelated thing. There's no reason not to look for both the practical and the truth.
What if there can't be a "Theory of Everything", and that the nature of the cosmos is such that it can't be understood or described from a single viewpoint or mathematical expression? The obsession with a TOE seems unreasonable, given that we are unable to even correctly estimate the number of stars in our own galaxy.
@@SabineHossenfelder Physicists are working on this, though they are not working with Wolfram. It's called constructor theory. Time, space, and particles follow certain simple rules defined in constructors, and the system evolves according to the "simple" rules and constraints encoded in the constructor, which then runs as a program.
I've been listening to the Lex Fridman podcasts where he talks to Wolfram at length about the Wolfram Physics Project. As a programmer myself I found the explanations really clicked for me. Very exciting. I've been waiting to hear your take on these simulations.
They are tired because the problem is hard, and Wolfram and co. do not come closer to explaining it merely by showing that a whole new approach is possible. He does not even make unique predictions that are potentially falsifiable.
@@shmuliknemanov4009 That's a bit of a "darned if you do, darned if you don't" situation. Would he seem any more credible if he did make unique predictions using novel methods? No, and there are already plenty of papers out there that have been torn apart for doing so. Because it has to first start out showing that it lines up with observations that have already been made, to show it isn't just some unserious attempt at attention-seeking by an amateur scientist.
@@Vaeldarg I think your definition of unique prediction is not what I had in mind. By unique predictions I mean the expansion of the explanatory power of a current scientific model. This happens when more facts can be modeled with the same number of rules or fewer. Once this is achieved, it is not such a trick to point out that an alternative theory can also do the same expanded modeling; it needs to account for even more facts than the newer theory. Wolfram has not done this; therefore he contributes nothing to science. Instead he is making a (not very original) point about the philosophy of science.
The answer is obvious: the observer needs to be outside space/time to not influence the results. This implies that universal constants are also multivariable. We can already see a hint of this in the speed of light, c, which is constant, but only when not passing through a medium. Gravity, atomic forces, and quantum effects are all variable. You only need one algorithm to define 3 points: 1 outside observer and 2 points in space at a given 4D distance.
Way over my head, even conceptually... but somehow very enjoyable to hear about a different way of looking at our "reality": a model to explain how our Universe/existence works.
Then give Donald Hoffman a shot. That's way easier on a surface level and way more mind-bending if you're willing to listen. No, I'm in no way smart enough to comprehend the mathematics behind it, but the idea is very, very interesting.
What a fun vlog today, I liked this one very much. As a former practicing geophysicist and geologist, now an IT guy due to most mines going extinct, when I heard this talk about Wolfram, it struck me how many similarities there are between his graphs and the late Stafford Beer's "Viable System Model" (VSM). Beer created it to help Salvador Allende and his finance minister reorganize Chile; however, that went very south. VSM is big in informatics, though, for describing systems thinking. Yes, it has nothing to do with either physics or quantum physics, but there is a twist, as described by Patrick Hoverstadt in his "The Fractal Organization": VSM is fractal, something that reminds me of Wolfram's ideas. And VSM can be used in natural science, though only some have observed it. Two who have done so are the social theorists Lave and March, who, in their seminal theory-building book, "An Introduction to Models in the Social Sciences," used how a sedimentary profile can be analyzed as a model to build social theories. As I read this, it became apparent to me, as a former geoscientist and amateur astronomer, that I could use VSM to describe the Universe from the Big Bang to Earth's biology as VSM fractal objects, similar to what Wolfram seems to do. Absolutely not the intention of Beer, but I have used it several times in IT lectures, because it makes it easier for the students to realize the total involvement and relations of everything, and to explain the Universe in astro outreach. Works perfectly, though with no Lorentz limitations or so. Maybe Wolfram and his collaborators are on to something. 😉 Thanks for this vlog.
I'm a developer and a Latin American leftist, yet I had zero idea about Mr. Beer, his work, or his collaborations with Allende. I went down a deep rabbit hole after reading your comment and learned an enormous deal from it. Thank you so very much.
I'd like to hear the conversation between Gorard and Hossenfelder. Although I don't understand a thing he is talking about, I am convinced that just listening to him can make you smarter.
There is so much observer dependence in QM and GR, that sometimes it begins to look like some kind of multiverse. I'm in my region of the multiverse, and you are in yours. There are ways to map what I experience in my observer centric universe into what you experience in your observer centric universe. Our universes aren't isolated from each other as long as we occupy communicating regions within our light cones. Just a fanciful way of looking at things...
I'm a software engineer, and while I'm no expert in graph theory/algorithms, it makes intuitive sense to represent the "fabric of reality" as a graph. In a video game, fabrics are literally represented as graphs, at least if you want any good representation of a fabric. The edges of a graph in this sense don't need to have a physical length; they're just an abstraction for the relationship between two nodes, an observer and the thing that is observed, in a sense. Like I said, I'm not a physicist, so I have no real idea about the laws of nature and what the established body of knowledge states, and I may have misunderstood your critiques. But if space-time can be an emergent behavior, being the result of a calculation based on two related points of the graph, then the act of calculation being the thing that produces space-time would imply it's not constrained by it, no? I'm not sure that it being an iterative calculation is a good counter-argument, as computers are iterative due to the constraints placed by the physical world. That being said, I think video games may be a good advocate for an iterative calculation, as we have tricked ourselves into thinking the images displayed on the screen are a continuous stream of data when they're really just fast-moving still frames. We may not have sophisticated enough measuring tools to say there isn't some iterative calculation. Maybe that is why there is a speed limit to the universe: the speed of light is simply the "frame rate" or "clock speed" of the universe over a given amount of time/space. I think this would also make sense of why things tend to behave as waves. As each point of the graph observes the computation, it then communicates this change to the nodes it's connected to, causing a pulse-like behavior. I have no idea how this would explain the "spooky action at a distance" of QM. It could be that there is a connection between every single node in the graph and that only specific types of computation affect a select few connections of those nodes. From a computational perspective, however, that's very resource-intensive. Not only would having X^2 (I think?) connections in the graph take up a huge amount of RAM, but filtering those lists to only affect (an assumedly) small and specific subset of neighboring nodes is computationally intensive. But that's also from the perspective of traditional computers. I'm no physicist, so this could very well be just naive rambling, but it does make intuitive sense from a computational standpoint.
> it then communicates this change to the nodes it's connected to causing a pulse-like behavior. That's exactly what's happening. And an object moving at relativistic speeds can't be updated as quickly, and thus it experiences time dilation. This says that time does not really exist; time is simply the measured rate of change of things. This also explains the upper limit for speed, which is the speed of light: pulses from one node to another can only travel at a certain fixed speed, thus creating a maximum speed of propagation of state between the points. > I have no idea how this would explain the "spooky action at a distance" that QM does. QM is wrong; spooky action at a distance does not happen. The particles have mechanisms inside them which alter their properties proportionally if they are 'entangled', i.e. created from a common source. That's why there is a correlation measured that breaks the Bell inequalities. If you try to create a program that simulates entanglement, you will realize that the modification of the particle happens while the particle is in flight.
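(A toy Python illustration of the "pulse" picture in these two comments, on a made-up graph: a state change spreads one hop per update tick, which automatically builds in a maximum propagation speed, loosely analogous to the speed-of-light limit being described.)

```python
from collections import deque

def propagate(neighbors, source):
    """Breadth-first 'pulse': return the tick at which each node first
    hears about a state change at the source (one hop per tick)."""
    arrival = {source: 0}
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nxt in neighbors[node]:
            if nxt not in arrival:
                arrival[nxt] = arrival[node] + 1  # one edge per tick: the speed limit
                queue.append(nxt)
    return arrival

# An 8-node ring as the made-up "space"; node 4 is farthest from node 0.
ring = {i: [(i - 1) % 8, (i + 1) % 8] for i in range(8)}
print(propagate(ring, 0))  # the far side of the ring hears about it at tick 4
```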
Not that I'm throwing my hat in with Wolfram's hypotheses, but Einstein's relativity, particularly General Relativity, is a bag of problems in and of itself that requires a lot of... massaging... to get it to describe reality as we believe it is. So if compatibility with General Relativity is the hangup, why bother trying to make it compatible? GR isn't etched in stone. It, like all mathematical hypotheses, is a logical approximation. By setting it aside entirely, another hypothesis may (or may not) find a better set of ideas.
We know that GR is missing some pieces, and we hope that someday it will be set aside in favor of something else, but that only happens when a new theory can explain *everything* that GR explains plus new things that it can't explain. I think Sabine was saying that Wolfram's previous stuff (the one where the graph lines have lengths) can't explain things that GR does explain, therefore it's not a candidate to supersede it.
I can't see how it's reasonable to describe General Relativity as a "bag of problems" that "requires a lot of massaging to get it to describe reality as we believe it is." That just sounds like bunk. GR is THE best model we have at cosmological scales and thus far the ONLY model which describes gravity accurately in relation to the experimental and theoretical physics that we know of. It would not be wise to discard it as a "mathematical hypothesis that can be set aside entirely."
@chriscurry2496 What is "space-time"? Aside from a mathematical coordinate system, that is. If light is traveling as a wave, in what medium is that wave propagating? I'm not asking for formulae or specific values. What is it? Just another name for 'aether'? Or is there a fundamental physical 'thing' we call 'space-time'? I'm not attacking GR. Just because we have a formula to approximate reality doesn't mean it's an actual description of it. One of the chief problems with "science" as respective fields of study in modern academia is that it creates fan-boys who are emotionally incapable of either accepting the shortcomings of their fields or tolerating the possibility of falsification, even though 'falsifiability' is a fundamental requisite of "science". If an alternative hypothesis doesn't instantly disprove EVERY aspect of something, it is immediately cast aside and ridiculed. GR, the Standard Model, evolution by natural selection: these things didn't reach their current potential in a vacuum, or instantly, or without DECADES upon DECADES of further study, hypothesizing, calculation, and observation done by countless people. 'Theories' don't spring up fully formed. In many cases, GR returns a divide-by-zero error. Every 3rd grader knows that you can't do that. So we've had to come up with ways to cope with various kinds of anomalies. Setting GR aside for the sake of pursuing another potential hypothesis isn't an attack on GR. It's admitting we don't know everything and that a new direction may yield new information.
@@leandercarey "What is space-time? Aside from a mathematical coordinate system that is." I'm going to tell you the right answer, but you're probably not going to like it or accept it. And if you can't, then we're pretty much at an impasse. In fact you and modern science is at an impasse ... So you've repeatedly asked what this space-time "is" by explicitly rejecting mathematical descriptions. This reveals, if I'm not mistaken, a deep bias which will probably make reality outside of some simple stories about reality allude you. For, what kind of answer could possibly be acceptable to you if you reject the one which describes, with real number precision, with geometrical precision? Like, someone could describe it with an analogy--say, I could tell you that space-time is like a spongecake, and it warps and bends and it's curvature is what we call "gravity." But that's not REALLY how it is. How it REALLY is, is exactly how the mathematics REALLY describes it--that's IT. That's as precise and as real and as fully fleshed out in all its predictive power that you can ever get about ANYTHING. If you reject that, asking for what it REALLY is (but not with some functions or linear operators--puh-lease!) is like asking what a car really is without the engine and the frame and the windshield, and etc etc etc. And because you're apparently not so fond of math, let me defend the singularities inherent in General Relativity ... You see, it's not that there are "divide by zero" issues. There's a field that EVERYONE should take or be taking called Calculus that shows you how to handle that with another "evil" mathematical concept called "limits." You can apply these limits to differential equations and find that you really don't have to divide by zero ANYWHERE within Einstein's beautiful partial differential equations (despite their truly evil non-linear nature). And what you find then is that the singularities are a thing of beauty in and of themselves. They show you where the theory breaks down, and needs more. In NO WAY does that justify "replacing" relativity (you went on about "fan boys" or some noise and I didn't really care to follow that closely). In fact, it's a strikingly rare case where Einstein's theory actually predicts spots where one should look to advance the theory, because it's insufficient (ONLY in those spots, you see?). Even if we WANTED to replace GR, which we don't, then nothing even comes close to doing what GR did, does, and will be doing in the future for a long, long, long time. It is very likely that General Relativity will NEVER be replaced, even if and when we find solutions to singularities which require enhancing the theory, or enhancing Quantum Mechanics or something like that. It's just that good and useful.
@@leandercarey --- One of the chief problems with "science" as respective fields of study in modern academia is that it creates fan-boys who are emotionally incapable of either accepting the shortcomings of their fields or tolerating the possibility of falsification, even though 'falsifiability' is a fundamental requisite of "science" --- So I wonder where statements like this even come from. Where are you getting your information about how "science" works in "modern academia"? To be honest, this sounds like some kind of parody, or nonsense generated by social media, where you don't need any sort of standard of truth. --- If an alternative hypothesis doesn't instantly disprove EVERY aspect of something, it is immediately cast aside and ridiculed. --- HUH? In what world does THIS happen? It honestly sounds like you've been huffing that Brett Weinstein bullshit. Everybody should seriously stay the hell away from that. It just leads to nothing but a bitter headache and hating everyone, apparently. --- Setting GR aside for the sake of pursuing another potential hypothesis isn't an attack on GR. It's admitting we don't know everything and that a new direction may yield new information. --- Again, dude, I honestly don't know what you're even suggesting here. You can "set GR aside" all you want. Have at it. I'm just telling you that at the end of the day, ignoring it isn't going to help your theory none.
Obviously, I don't understand Wolfram's idea at all, but my intuition is that we can always only think about the universe in terms that are familiar to us. First it was gods on Mt Olympus who were responsible for everything; then we had the concept of the clockwork universe once we perfected clocks. Today people again try to cram the universe into boxes that they are familiar with, be they probabilistic, multi-dimensional or computer code. From the sound of it, Wolfram's idea seems to be promising. An algorithmic universe has a certain appeal. But that doesn't necessarily mean that it leads to a simulation. Maybe a computer simulation is just the closest concept to reality that we can think of at the moment. Maybe the universe approximates an algorithmic universe, just like it approximates a Newtonian, mechanical clockwork universe, only slightly better.
Yes! I'm not sure Wolfram is making a literal ontological claim about the universe being a hardware computer running a simulation like the Matrix movies. Only that it functions in a similar way? Difficult. Fascinating too.
@@thejackanapes5866 The problem is that people will take a model's metaphor, misunderstand it, and attempt to apply it literally. QM and "many worlds", for example.
Hi, I am Shon Mardani and this is my Unifying Theory of Everything: [GOD] Created NOTHING, a Void and Empty Point in Space. NOTHING Attracts Surrounding Space; this attraction or Pull is the Only Law of Nature and it gives NOTHING its Property to be a Particle, which I call the GRAVITATIONAL PARTICLE (GP). When [neighboring] Space moves into the GP, it Creates its own GP at that Vacated Space, which Attracts its own neighboring Space. The high speed of the moving GPs will result in duplicating and creating new GPs which propagate and grow like a Crystal. This growth is in 3 Axes (x, y, z) and 2 Directions (-, +) on each Axis, therefore the numbers 2, 3 and their sum 5 are the Magic Numbers of Nature. The Propagation or Growth of the GPs Creates Closed Cyclic Patterns/Paths/Traces as Locked Loops. These Cycles Create the Atoms/Matter/Existence starting with Hydrogen. Hydrogen Atoms Collect to Form Nitrogen and Oxygen Atoms of the Atmosphere at the ratio of 4 Nitrogen to 1 Oxygen, which I call the ATMOSPHERIC UNIT (AU); then Gravity rearranges the GPs within the AUs and the Carbon Atoms are made. Atoms Connect with each other by Overlapping/Shared GPs (single, double bonds ...) to Create Molecules. Gravitational Particles move/travel within the Atoms and between the Atoms (intra- and inter-atomic), which I call the GRAVITY CURRENT, and it moves in a Circular Path toward its Center of Gravity. On the Earth the Gravity Current is perpendicular to its surface and is moving toward the Center of Gravity of the Earth and its Connected Atmosphere. The Gravity Current on the Earth's Surface is felt as a Push from above and a Pull from below [Earth's Center of Gravity]; it feels like being under a shower from a firehose. Atoms/Matter, because of their GP Loops, are the Physical Resistance to the Gravitational Current. Resistance to GP Current by Atoms manifests itself and is measured as Weight/Mass, Electricity, Magnetism, Heat/Temperature, Light/Color and all other Physical Properties of the Atoms, and they continually convert to each other. GP Current moves from one Atom to its neighboring and connected Atom; this means Conductivity is required for Gravity Current, all its Manifestations and the fundamental Existence of the Atoms. Gravity Current and all its Manifestations are Measurable, which means we can Count the Quantity of its Constituent Units like GPs and the Cycles and Loops of the Atoms, which make the Gravity Quantum. Since the Atoms are the collection of GPs locked in the Loops/Cycles, the Quantity of the GPs in an Atom determines its Shape (Atomic Numbers), and the 3D Positions of the GPs determine the Orientation of the Atoms, or their TIME. Gravity Current continually rearranges the GPs in the AU Atoms and changes the Atom's Orientation, and results in Embedding or Coding the Time in the Atom, like Stamping the Current Time in Atoms. The Interactions between the Fundamental Organic Atoms (Hydrogen, Nitrogen, Oxygen and Carbon) with Different Embedded Times (timestamps) Creates LIFE. Breathing, Drinking and Eating the Organic Atoms and Molecules with Newer Embedded Times provides the Energy to Sustain Life and the Energy to Move; it also provides the Energy to Synthesize Heavier Organic Atoms like Na, Mg, P, S, K and Ca by Living Organisms. To understand the concept of the GP, imagine there is a room full of Marble balls with no friction between them; now remove/make disappear One marble from the middle of the room.
The empty space created by the removed marble attracts the neighboring marbles to fill in the empty space; however, only one of the 6 possible adjacent marbles can move in and fill the empty space: the one on the left, right, front, back, above or below. One of the marbles moves into the empty space and by that creates its own empty space and attracts its own neighboring marbles. This is the Path/Trace of the Moving/Propagating/Growing Empty Spaces (GPs). We do not need the existence of any matter in the space to move into the GP; it is not the Matter which Moves, it is a GP which is Created in that Space and moves to the other GP. It is like the positive side of a battery, which does not exist without the negative side of the same battery; so if you create the negative side, the positive side is created and will consume the negative side unless it is connected to the other batteries. I have developed this Theory of Everything by studying the latest knowledge in physics, chemistry and biology, especially the Periodic Table, over many years, and have done my best to validate it and find any conflict with observed and experimental facts. I appreciate your feedback to help me validate and correct any errors there may be before I get into the details, thank you.
Well, hypergraphs are just good objects in general for modelling a problem domain. It's an area of general research with a lot more applicability than we give it credit for; it applies not just to physics, but to computer science, chemistry, etc. Hypergraphs are just a good language and offer a generalizable structure for reinterpreting existing problems and solutions, as well as for finding new and interesting ways forward.
I appreciate your well-reasoned skepticism. It is so rare these days. Please keep commentary like this coming. I feel that it is a great service to the reasoning public.
When I first watched a Wolfram lecture it triggered me to take a core tenet of what he was saying and extrapolate it into a basic truism: All occurrences in the universe, from origin to today, from quanta to galactic clusters, can be seen as a series of interactive algorithms which build on each other to create an algorithmic universe. When any two particles join they have created an algorithm that will now react to inputs from their environment in specific (now patterned) ways. As particles continue to join together in higher complexity, they form more specialized algorithms: nuclei and electrons, atoms/elements, molecules, compounds, geology, life, biospheres, planets, stars, solar systems, galaxies, the universe. All is a Structural Natural Adaptive Networked Algorithm. All things remain interactive with each other in grand interconnected algorithms (for example living forest soils). The universe is a grand interconnected adaptive algorithm.

Some Natural Adaptive Networked Algorithms become relatively stable. Examples: The Sun. A black hole. A rock. A marble. There is a spectrum of stability vs complexity between a rock and weather. In high gravity systems like the Sun or a black hole, complex algorithmic interactions are crushed and incorporated into a relatively simpler algorithmic system. (However this is reversed in stars that go nova or supernova, thereby restarting a new cycle of building to complex algorithmic interactions). In solids like rocks and marbles algorithms get locked into a very slow moving and outwardly interacting matrix.

ORIGINS OF LIFE
We can also look to the process of algorithms building and changing in the first biological life (and in the pre life just before it - molecular algorithms). All interactions between living beings are likewise natural adaptive networked algorithm interactions, but with highly improvisational, almost stochastic complexity in possible outcomes (like weather). Genes, chromosomes, genetics, are natural adaptive networked algorithms. Genetics combined with Epigenetics form higher complexity Natural Adaptive Networked Algorithms. (This is the last nail in the coffin of Dawkins' Selfish Gene theory.) Religion is also an adaptive networked algorithm (manipulated by human designed algorithms/memes). And thereby we can riff into the algorithms of culture and even computers and computer programs themselves, coming full circle back to where Wolfram started us off.
Very interesting, and I agree, but I think the breakthrough comes when it's discovered WHY this happens and not only WHAT is happening. Still, the WHAT contributes greatly to progress.
@@void________ True. But the question of **why** essentially gets us to the fundamental question of what existed and happened before the 'Big Bang', which is as difficult to parse as the question of unifying relativistic and quantum physics. And of course both questions likely have the same answer. The only conception I've been able to imagine so far is that before the 'Big Bang' the only actual physical thing in the universe was gravity itself, and what I mean by 'gravity' is a uniform self-attracted substance (what could be colloquially described as a gummy magnetic 'plasma') rather than just a 'force', which was at first in equilibrium but then fell out of equilibrium, and in doing so began the process of the 'Big Bang'. Here's the concept: Picture, for the sake of argument, at the 'beginning' of the universe a uniform ball of this 'gravity' in a stable equilibrium, which either inherently could not remain stable or was disturbed in some way. Parts of that 'gravity' plasma then asymmetrically and differentially separated from other parts and then packed down tightly into sub-particles that were internally convoluted in varied ways, which made them differ from other packed-down gravity 'sub-particles'. This separating and packing down was then followed by periodic uneven-pressure-triggered explosions back outward, followed by even more complex separation and packing down, with the entire new collection of **everything** all crunching down together in on itself and then exploding back out (possibly many times in succession). This process could eventually create all of the properties that we now see in all matter/energy (including repulsion) - and in turn the universe as we know it. I'm a layperson and don't have the sophisticated mathematical knowledge and skills to put this credibly on paper, but I'm hoping that eventually AI might help me do so. Regardless, if any of you trained physicists who read this want to take the idea and run with it, be my guest.
Never thought much of the movies, and that joke is so lame. Perhaps the books were a lot better. Maybe the joke is just too good for me to comprehend....
@@deathorb To say the movies didn't do the original work justice would be the understatement of the century. The reason why all of the Hitchhiker's Guide quotes are so famous is that the books were one of a kind, unmatched ever since in more ways than I have time to explain in a comment. If you're judging the series from only having watched the old movies, it is understandable why you don't understand their significance.
The most interesting aspect of Wolfram's work is the idea that the complexity of our universe is emergent from basic patterns. That in itself is worthy of much more investigation.
One of the many things I love about Sabine is her humor. In the midst of trying to explain all this she throws in, at 7:10, "while trying not to go insane". It completely disarms you for a moment and then it's back to the discussion. It allows you to take a mental breath and then return to the subject. Always timely. It allows me to understand more of the physics, not being a physicist, without giving up. Thanks Sabine
I like this because I like all science, and I now try my best, when it comes to science, to be as open-minded as I can and not to be biased. The universe is what it is. Whether it is perceived in different ways, it can certainly be explained in different ways, each way simply being a language, different from the next.
There's this video of Jonathan Gorard giving a presentation to a bunch of physicists from a few months ago. I found his talk very informative and not too difficult to follow, although it could have been a bit more organized. But the other participants in the call really showed a fundamental confusion that must have set in right on slide 1, given the kind of questions he got. I found this very illuminating and kind of funny. I mean, graphs are not exactly rocket science, but they were so stuck in their old ways of describing things that it would have taken a while to get them to understand even the (rather simple) fundamental ideas. Poor Jonathan stayed extremely polite (he's British after all) but he must have felt like someone explaining the way to the airport for 15 minutes and then suddenly realizing that his counterpart didn't speak English.
I've been thinking about something akin to this approach for many years. A friend of mine (PhD logic and philosophy, but his undergrad was physics at Caltech - my undergrad was also physics, Harvard) and I thought that one way to approach the preferred basis problem from an Everettian point of view was to consider the idea that the reason for the preferred basis is that it is the only basis in which one can form the sorts of causal relationships (i.e., somewhat closed feedback loops in induced spacetime) that you need for perception and cognition (i.e., life as we know it). Even the simplest organisms require this sort of causal feedback, and we know that the preferred basis is the only basis which has local spacetime. If you think of the universe in causal set terms, one might then conclude that there might be a way to explain why spacetime appears to have the structure it does (Minkowski-like manifolds): it becomes an optimal way to obtain the apparent causal relations when constrained in this way (i.e., by the requirement that one has to be able to have perceptual/cognitive feedback loops of some kind). In order to work this out one would naturally start with SOME notion like causal sets - causal relations but not *distance* relations (distance would be an emergent phenomenon, not something built into the network/graph).
What does it matter what the observer sees? The object's size or energy doesn't change, just the observer's perception. Relativity should mean that observations depend on the observer, but they don't affect the object being observed, just the perception of it. It's kind of like rumors and facts. Rumors may be derived from facts, warped by the people observing them, but they don't change the facts.
Ok, but then how do we determine what standing perfectly still is, so that we can have a baseline energy for some wavelength? And why is it that when I drive a car going 20 mph towards another car, going 20 mph towards me, it looks like it's going 40 mph and I feel as though I'm standing still, but no matter what speedy vehicle I attach my flashlight to, I can never get the light to come out faster than C? My light sensors don't see the speed of light from the headlights coming towards me as C+20 mph. The observer doesn't change the universe, but nothing in the universe has a "true" size. And this is all totally separate from the expanding universe stuff. All we know is no information seems to be able to travel faster than the speed of light. The reason photons can travel that speed is they have no mass. More mass = more energy to accelerate something.
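The textbook answer to the two-cars-versus-flashlight puzzle is relativistic velocity addition - the everyday case just hides the correction term:

```latex
% Relativistic composition of velocities:
u' = \frac{u + v}{1 + uv/c^2}

% Two cars at 20 mph (u = v \approx 8.9\ \mathrm{m/s}):
% uv/c^2 \approx 9\times10^{-16}, so u' \approx 40\ \mathrm{mph}
% to extraordinary precision. Mount a flashlight instead (u = c):
u' = \frac{c + v}{1 + cv/c^2} = c\,\frac{c + v}{c + v} = c
% so every observer measures exactly c, whatever v is.
```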
@@Thomas-gk42 This to me says that all observers are equally delusional. I suspect there is a reality that cannot be observed, but physics does not want to deal with this, because it is pretty much impossible if you adhere to the scientific method. Amazingly, what is observable has increased over time, with better sensors and so forth, so there may be a lot of things we have never even imagined, much less observed. Science is not the only path to truth.
@@zunuf I get it that C is the max speed, just like the Planck is the min (length, time, whatever). But why would you need a baseline? It all depends on you and the object you observe. If it's light, it already moves at C, so the only thing that can change is the color. If it's a particle, and both it and you move towards a point of head-on collision, and the sum total of speed from you and the particle is C or greater, you would "see" it moving at C because information's max speed is C. But that does not change your speed or the particle's, does it?
Glad to see this work is getting the open-minded attention it deserves. The way I see the work, he’s more or less exploring the consequences of a specific, but simple mathematical axiomatic system - the computation of updating rules on a hypergraph. The thing about it is that it is capable of expressing any discrete computation. If the universe has discrete properties, then by definition his system can compute it if you find the right rules and initial conditions. The main challenges are finding those rules, and identifying when you have done so. So it’s a little bit like someone taking ZFC and saying it’s a “new kind of physics” because by developing enough theories you can find some with symmetries that mirror the symmetries of physics. In the end, his mathematical system is probably equivalent to just about any other mathematical system, so the main benefit would be if it is somehow easier to compute or evaluate physics-like theories. One neat thing about his work is that QM and SR pop out fairly early on as a consequence of making the system more general and removing assumptions. Another is that it’s fairly straightforward to generate and simulate any ruleset. One area that’s most concerning to me, as far as its promise for physics, is rotational symmetry. I don’t understand how to get rotation invariance from a spacetime composed of a graph. His explanation seems to be that rotational symmetry is a limiting approximation, but as you point out, this should still lead to detectable quantum effects. Maybe the work resolving Lorentz invariance that you referenced also addresses this?
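For the curious, here is a minimal sketch of what an "updating rule on a hypergraph" can look like in code. This is my own toy illustration, not the project's actual implementation; it uses binary edges for simplicity (real Wolfram-model rules match sub-hypergraphs of arbitrary arity) and applies a simple rule of the kind used in their examples, {{x,y}} -> {{x,y},{y,z}} with z a fresh node:

```python
import itertools

def apply_rule(edges, fresh):
    """One update step of the toy rule {{x,y}} -> {{x,y},{y,z}}, z fresh:
    every existing edge keeps itself and sprouts an edge to a new node."""
    out = []
    for (x, y) in edges:
        z = next(fresh)               # brand-new node id
        out += [(x, y), (y, z)]
    return out

fresh = itertools.count(2)            # node ids 0 and 1 are already in use
edges = [(0, 1)]                      # initial 'hypergraph' (binary edges here)
for _ in range(4):
    edges = apply_rule(edges, fresh)
print(len(edges), "edges")            # 16: the edge count doubles each step
```

Even this trivial rule grows structure exponentially; the hard part, as the comment says, is finding rules whose emergent structure mirrors known physics, and recognizing when you have.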
This actually makes a lot of sense. The issue with progressively completing a row of data in the illustration is that it subtly relies on the intuition that everything is constrained by time as some sort of privileged dimension - you expect to see the information propagate over time, and you assume that the row in front of you is a single dimension with no information contained across the Z axis or beyond. But ultimately any Turing-complete process can take any amount of logic - logic which may 'manifest' as multidimensional models - and represent it in a single dimension. And what happens in a Turing-complete program? Nothing all that special - just a massive tangled web of memory pointers. And if you 'decompressed' and stretched a program out into a shape so that you could visualise the states of those pointers, you'd have your graph. Our arrow of time isn't necessarily shared by the fundamental logic of reality; it just becomes an emergent dimension, and there's no resulting need to be able to observe the propagation of information across this graph, because it's the other way around. Our experience of time is completely divorced from the relationship between these pointers. Now, my issue with everything is this: I'm not convinced that this is really a 'theory of everything'. At best, it's a representation of the lowest-level fundamentals of reality; however, if we make the assumption that anything which occurs is governed by some form of calculable logic, then every physical process is already theoretically Turing-complete... and all this theory does is articulate it in a mathematically coherent way. But it doesn't really reveal anything about what the properties of the universe *are*, or bring us closer to understanding the reality etched into this fundamental fabric. It just shifts how we represent information in a wildly complex system, without giving us the tools to make new inferences. It doesn't give us a new model; it's an unfalsifiable conjecture which will always remain possibly correct so long as everything in the universe has some basis in logic, which you can always argue is the case because it's real and it's real *somehow*.
Interesting. If we avoid calling it "time" and instead think of it as causal relationships, I would say computations still have this. Even the simple computation of a mathematical expression has inherent causal relationships, i.e. multiplication must be performed before addition, brackets and exponents before that, etc. There is a concept of before and after that, if not obeyed, changes the ultimate result of the computation. And it's these causal relationships that are captured in the networks/hypergraphs. And I think Sabine would agree with you that there is no rule that says the universe must have one single theory of everything. There could be several discrete sets of rules that don't overlap. However, if we look at the observation that nature seems to prefer the lowest energy states, then an Occam's Razor type approach can be attractive. If we had to choose between a single fundamental structure to reality from which everything else emerges naturally, OR several separate fundamental systems that somehow interact but don't contradict each other - which one seems more likely? Now, unlikely things happen all the time; there is no rule that says it's only ever the most likely outcome that occurs. BUT the fact that it's the simplest explanation does have a draw to it.
@@aggies11 The order of operations in mathematics is just a convention tied to the syntax we use to represent expressions, but they're not inherent. You can devise an equally valid way of writing things down in such a way that multiplications follow addition, but to retain equality you'd have to rewrite it all. Additionally a causal relationship, at its lowest level, is a reversible operation, so there's no inherent direction to the flow of information across a graph. If I were to draw out a graph with three states: 4---5---6, you might see it as an upward progression, but that's not a property of the graph, that's just a result of reading it left to right and inferring that we're adding one each time. Someone who speaks Arabic or Hebrew might look at it and infer the opposite. Both are right. That's what I meant by the arrow of time as we know it being a distinct phenomenon to the flow of information as dictated by the absolute lowest level of reality, however there would still be 'an' arrow of time across which a lot of the logic unfolds independently of time as we experience it unfolds. I didn't mean to suggest that Wolfram's framework for devising models of reality could coexist alongside others, as that would contradict the nature of his entire theory.
@@minikame2272 Ah, ok, then we were in essence saying the same thing. For me it wasn't the direction of the "4 5 6" in your example, but rather that there is a causal relationship between them (i.e. 5 must always come between 4 and 6; we can't start or end with it).
Hey Sabine, I love your videos! You are probably aware of the equivalence between different well-known models of computation (Turing machines, Cellular Automata etc.) that require a global or synchronized clock. There is a less-known approach to computation based on interaction nets (or interaction combinators) that model computation as rewriting a graph structure which is similar to rewriting expressions in lambda calculus. However, interaction combinators allow asynchronous concurrency, meaning that the graph rewrite rules can be applied on different parts of the graph and the order does not matter since the computation and final structure stays the same -- similar to how in relativity different observers may not agree on the order/simultaneity of events but the physics stays the same. I hope their research is successful or creates useful technology.
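The order-independence described here is the confluence (Church-Rosser) property. Below is a tiny hand-rolled illustration - not an actual interaction-net library - where the rewrite "delete an adjacent ab pair" is applied in random order yet always reaches the same normal form:

```python
import random

def reduce_word(word, seed):
    """Delete an adjacent 'ab' pair, chosen at random, until none remain.
    The system is terminating and its redexes never overlap, so by
    Newman's lemma every order of application reaches the same normal form."""
    rng = random.Random(seed)
    s = list(word)
    while True:
        redexes = [i for i in range(len(s) - 1) if s[i] == "a" and s[i + 1] == "b"]
        if not redexes:
            return "".join(s)
        i = rng.choice(redexes)  # arbitrary scheduling choice, like async rewrites
        del s[i:i + 2]

# Twenty different schedules, one unique result:
print({reduce_word("aaabbbaabb", seed) for seed in range(20)})  # {''}
```

Interaction combinators get the same guarantee by construction, which is what lets them model asynchronous, relativity-flavored concurrency.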
Yes! Thank you, Sabine, the energy carried by photons _is_ observer-dependent! I picked up on this after seeing an interview with the indomitable Sir Roger Penrose (a man who must surely rate as one of, if not _the,_ greatest living physicists), but, like so many other things in physics for the avid layperson, it's always good to have an idea you've picked up on made more concrete by another practising physicist. And with her little digression about photons Sabine did just that for me, and literally only days since I started to contemplate it... satisfaction! I was stunned when I heard Sir Roger mention that in the very early universe, when energies were enormous and high-frequency photons were blazing around everywhere, there was a problem with this picture. The problem was that if you could zoom along next to one of these photons, then you would just see a regular, low-energy photon, and not one of the super-energetic ones that were supposedly carrying much of the universe's energy with them! This wasn't what concerned me, however; what I was thinking was that you, me, and every other living thing on this planet owe our existence to the low entropy of the Sun and, more specifically, to the high-frequency, energetic photons that we receive here on Earth. Plants then convert some of that energy through photosynthesis, and through that conversion, they grow in size. We then eat those plants, or else we eat the animals that eat those plants, and then our cells use the energy stored in our food to create ATP molecules, and we then use those ATP molecules to create an energy gradient - to keep ourselves far from equilibrium - and this is how we all struggle on, somehow, however unlikely or difficult it may be to keep fighting the relentless march of entropy, we do it. You, me, and every other living thing on this planet owe our existence to the fact that the sun is a bright, hot object in a cold, dark sky. This is what keeps us going, this is the _only thing that keeps us alive!_ It's the _difference_ that counts. It's the energy gradient. That's what gives the Sun a lower entropy than the space around it, and we then use some of that low entropy in order to keep ourselves as far from equilibrium as we can, because equilibrium is high entropy, and high entropy is death. Of course, we don't actually _gain_ energy from the sun, because if we retained any energy that the Sun gives us each day, then the Earth would become uninhabitable in fairly short order. And so all the energy that the Sun provides during the day is emitted back into space at night - for every high-frequency photon that enters our atmosphere, 16 low-frequency photons are radiated back into the dark depths of space. And _this_ is what I was thinking about - if we owe our very existence to being able to harness the Sun's high-frequency photons to create an energy differential, but those photons only have that concentrated energy, that energy that we _need_ in order to stay alive, because of how we _see_ those photons, not because of how they inherently _are,_ then, well... isn't that just a bit whack?!?! Your heart is beating right now purely because of the way we look at photons arriving here from the Sun. Nothing more, and nothing less. Anyway, _I_ thought it was whack. In fact, I thought it was absolutely bat-shit crazy!
But then discovering that we are all being constantly accelerated _up_ into space at 9.8 m/s² by the internal pressure of the Earth, and that this effect is what we mistake for gravity near the surface of the Earth, wasn't the easiest concept to get my head around either! Physics is weird, the _world_ is weird, living in the world is _weird,_ and really, the whole damn show is _weird._ I mean it's very weird... we are alive because of nothing more than how we look at the sun... it's bloody-well bat-shit crazy! Anyway, have a great, sunny, day!
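For reference, the observer-dependence of photon energy in the comment above is the standard relativistic Doppler shift:

```latex
% Photon energy measured by an observer receding at speed v:
E' = E\,\sqrt{\frac{1-\beta}{1+\beta}}, \qquad \beta = \frac{v}{c}.

% 'Riding along' next to a photon means \beta \to 1, so E' \to 0:
% chase a gamma ray fast enough and you measure a radio photon.
```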
" energy carried by photons is observer-dependent!" not really the energy of the photon is fixed. Its just the energy released is dependent on the differential velocity between the photon and the object. The variable is the objects speed in relation to the photon. Sabine's Example about the photon is a poor one as it does not reflect Wolfram's model.
Saw Sabine and Roger live together on stage at the HTLGI festival in London last month. Two of the most honorable scientists humanity can count on currently. Great event.
"but then discovering that we are all being constantly accelerated up into space...and that this effect is what we mistake for gravity.." It's almost like you're simply wrong about how a lot of this works, but through confidence bias you're able to tell yourself otherwise. You wrote this either hyped up on meds, or in need of them.
@@Vaeldarg Well, you're half right at least; I mean, I _was_ hyped-up after seeing the video, and, in my defence, it _was_ about 4 am when I wrote it, so, you know... a little overtired, perhaps. However, as to your primary objection here, namely, that I was factually wrong about a physical process, well, I'm sorry, but on this point, it's you who appears to be wearing the scrambled eggs. And quite a lot of them too, I might add! I can say this with the utmost confidence because a.) it has been a very well-known and understood physical phenomenon ever since Einstein's general theory of relativity was first published: all bodies in contact with the surface of the Earth _are_ being accelerated _outward_ at 9.8 m/s², and, b.) there are a plethora of presentations, videos, and lectures out there on the interweb (ScienceClic's presentation on general relativity is particularly helpful) that support this fact, including even one that Sabine herself released recently, and, c.) you don't have to take my word for it, you can determine this for yourself in just a few minutes. Here's how:
• Climb up on your roof and attach some bathroom scales to your feet with gaffa tape. Look down and observe your weight. Now jump off the roof and you shall see that you weigh... nothing whatsoever. This might seem strange if you are being pulled down by the "force" of gravity - for if you weigh nothing at all, then what mass is gravity acting on?
• Take any two objects you like and (accounting for air resistance) observe how they _always_ seem to fall at _exactly_ the same rate. This was an effect that was never satisfactorily explained within Newton's account of gravity, but when you realise that when any two objects are let go they _do not move anywhere in space,_ and that it is the surface of the Earth itself that rushes up to meet them at 9.8 m/s², then it becomes immediately apparent why this effect occurs. Because _of course_ any two objects will _appear_ to fall at the same rate _if they aren't actually moving at all!_
• Ok, I can sense that you still aren't fully convinced yet, maybe, but this one's a clincher! Your mobile phone has an accelerometer built into it, so simply download an accelerometer app and look at the reading! It will tell you, and I say this with all confidence, that you are currently being _accelerated_ at the rate of 9.8 m/s² (give or take a hundredth of a decimal place or two)! And you can't be going _down,_ can you? Then, drop your phone from a small height onto your bed and look at the _lowest_ reading; it will tell you that it was then reading 0.0 m/s², and your phone, as you saw it "fall", went absolutely _nowhere!_ And so... voilà! Now you know!
Look, in all seriousness, I don't blame you for doubting the bona fides of some nutcase in the YT comments section, I mean, there _are_ just a few of that sort around! But I don't make statements here unless I'm absolutely sure about what I'm saying, and I can tell you now that I'd put my very life on this altogether incredible fact being, in fact, altogether credible! It _does_ require a small slap to the face from our old friend cognitive dissonance, and a little bit of quiet reflection, before it all sinks in, though. I must have spent about four months researching the issue before I came to finally believe that what I had heard was actually, really, factually, true. That said, though, I didn't think of using my phone to test it back then, so...
and I must have spent about nine months trying to convince my partner that this was her reality. She eventually said that she "got it", but I have my doubts about whether she ever truly did! I admit, it's nuts, it's totally nuts, but when _you_ get it, you'll never look at the world in the same way again! Post Script: there _is_ an effect of gravity near the surface of the Earth, but it's minute. It would take, for example, around three hours for a dropped pencil to reach the ground if only the "force" of gravity were acting on it. Afterword: if you think about it, gravity _can't_ be holding us down. The ISS orbits us at around 400 kilometres above the Earth, and the gravity at that height is still roughly 90% of what it is here on Earth's surface. So, if it _was_ gravity that was acting on us, then how could the astronauts just float there... almost as if they _weren't moving anywhere in space?_ Physics is a trip! Have a great, accelerated, day!
2:16 This is true for Turing-complete systems like classical and quantum computers, but what about systems capable of hypercomputation, like real-number computers? That would (in theory) allow for computation with infinite precision, removing the need for discretization. Hypercomputation doesn't seem to be physically possible in our universe, but if we're talking about the simulation hypothesis anyway, then what if it's possible in the physics of the simulator's universe?
We've had the concept of infinite precision with calculus for about 400 years. Binary arithmetic and graphing, to my knowledge, do not yield infinite precision; analog computing and its numerical methods would come closer.
@@johnrieley1404 Something struck me as very wrong about your claims about calculus. And I think it is this: calculus is not designed to produce infinite precision. It's designed to produce functions that allow arbitrary precision. No matter what function you end up with, it still must be supplied by the user with infinitely precise real numbers, which of course are practically impossible absent hypercomputation, as OP stated.
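In symbols, the distinction being drawn: a limit never actually divides by zero, and it promises any finite precision you ask for, rather than infinite precision:

```latex
% The derivative as a limit: h is never set to 0, only driven toward it.
\lim_{h \to 0} \frac{f(x+h) - f(x)}{h} = f'(x)
\iff
\forall \varepsilon > 0\ \exists \delta > 0:\;
0 < |h| < \delta \implies
\left|\frac{f(x+h) - f(x)}{h} - f'(x)\right| < \varepsilon.
```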
As someone who likes graph theory, I see this as an absolute win. As someone who had to look up the ideal gas law, I have no idea what you're talking about.
4:42 There's a way to fix this problem with computation. In reality computation takes time, and that creates basically a 1:1 relationship between the speed of information being computed and the size of objects. I.e., the faster you go, the more cycles you take from the Universe's CPU, and the thinner you are going to appear to another observer; space itself becomes distorted because you have different computations happening at different speeds. The length of something is tied to the time of computation, because that's how you calculate how long something is: the amount of time light takes to go from one point to another, meaning the number of cycles it takes - but the number of cycles remaining for the computation changes. Obviously light always goes at the maximum speed, the clock rate of the computer, but particles that are not mere simple information take time to be changed, so they will appear smaller. "c" is merely the cycle clock of the Universe, so when you go fast you basically consume all the cycles, and if you go at the speed of light "c" you basically stop moving and become so thin you disappear; that also solves the problem with singularities.
I don't understand why Wolfram's approach is considered so non-standard that physicists ignore him. Isn't this the exact same thing string theorists have done for 30 years? Neat ideas, zero backing evidence. Making a theory first, going looking for evidence after? I know it's ass backwards as science goes, but if string theorists can do it, why can't Wolfram?
Maybe because he is independent (socially, intellectually and financially) and therefore able to play his own game - not theirs. So he is an outsider by definition. But I think he would not be the first outsider to bring science forward (see T.S. Kuhn: The Structure of Scientific Revolutions).
@@maclypse lol, String Theory is much closer to physics than Wolfram's model (at least currently, and for the foreseeable future). Whenever I read of someone bashing String Theory I know they just haven't really studied physics.
Physicists have relied on "theory first" work done by mathematicians for centuries. How many advancements in physics were facilitated by some abstract math, developed decades or centuries prior, wherein the physical is a special case? Wolfram's work sits somewhere between pure math and theoretical physics. He and his team are working to build a framework wherein the patterns of established physics emerge naturally but should also provide some additional predictions. They are still in early stages. AND, unlike string theory, he is funding it himself. He is not wasting everyone else's time with promises of experimental verification right around the corner or, conversely, never.
Wolfram. Great name. I've heard him talk, and in my opinion he's intoxicated by the exuberance of his own intellectual verbosity. He's a businessman and has to sell his ideas no matter whether they are true or not.
Since the universe is so old, it presumably runs on a mixture of FORTRAN and COBOL with a bit of assembler for the singularities. On a serious note, I'm concerned that computer scientists seem to be trying to fit reality into computing rather than vice versa. String Theory seems to have uncomfortable echoes of how hard drives work, for instance. The fact we can make a discrete model of something does not mean the something is discrete; ask Ptolemaeus, Copernicus and then Kepler.
"It could actually work" Isn't this the nicest thing ever Sabine ever said about a new theory?!? Damn! The dude must be stellar!
I can only assume she is being held hostage by Wolfram and forced to say these lines.
Nice pun. Intended or no?
@@ALTruckerDad Non-native English speaker here. I had to ask ChatGPT what pun you meant.
He's a genuinely smart guy; I listened to one of his talks.
I think Wolfram's line of reasoning makes sense, in the sense that he's looking for a pattern from which the laws of physics are an emergent behaviour
@@joeboxter3635 Isn't this, like, regular physics with particles ("constructors") that follow laws of nature ("constructor rules")?
@@joeboxter3635 Regular particle physics already uses constructor math a lot. It's nothing new.
@@Tarnbar I was just thinking that. It sounds like a different way of describing the same thing.
This is what physicists have been doing at least since the beginning of the 20th century. Unification of forces is the same under the hood.
^^ Agreed! In my own metaphysical musings, I've kept coming back to the idea that topology is fundamental. Abstract rewriting basically covers how one thing can become, or be causally related to, another thing. The topology of a set of things is like its structure, something observable. And how the topology changes according to rewrite rules is akin to how we observe a dynamic universe where these observable phenomena change over time. I believe taking an unbiased approach of summing over all possible rewrite rules can be compatible with the notions of relativity and entropy/randomness. It seems to make sense that the universe is trying all possible transitions from one configuration to the next, but that some transitions negate each other, leading to symmetries of the universe, such as the principle of stationary action.
It's like within the space of all possibilities of the universe, there is a manifold of possibilities that follow these confluent laws of nature. And a relativistic frame of reference is like a projection of that higher-dimensional manifold onto a 1D submanifold, being that observer's local perspective of causality. And finally, the random nature of the sum of rewriting rules leads to increasing entropy and the perceived direction of that local causal submanifold.
@@NightmareCourtPictures if this works, would it explain the "fine tuning"? If you start from nothing and physics still falls out, then that implies that this is the only way things could have been, right?
@@samuelwaller4924 Yep that's right. I believe fine tuning is well explained under this model. There's no need for Multiverses or carefully constructed initial conditions.
It's refreshing to see that Wolfram actually listened to the flaws in his theory and found someone to help address them. Good to see that Gorard used the work from the causal sets people instead of reinventing it. Even if the theory/direction is shown to just not work, at least we would know what not to work on, which would already make it better than string "theory".
I wonder whether the progress in analog (computer) chips made or published around 2022 brings any comprehension advantage, but I'm not an expert on that "topic".
Refreshing? Are things so bad in science that it's no longer standard practice? Isn't it still the point of peer review to find flaws for the author to address, or alternatively to confirm or supplement the work in case it's found to be valid?
@@1112viggo According to some of Sabine's other vids, the point of peer review is to have professors and researchers check each other's work for free so that the publishers can make money.
The work is required of university professors, so it becomes more of a quota to meet than a way to progress scientific knowledge, and therefore is often just busywork.
Peer review does find errors, but most published articles are just rehashings of the same discoveries, so there aren't many mistakes to find in the first place, because the work replicates what is already known.
@@user-fk8zw5js2p lol right, I saw that episode as well. But I'd like to think that the original intention of peer review still serves its purpose, albeit with a modern twist of exploitative bureaucracy 😅
Their current status seems to be that they can explain superposition and decoherence in quantum physics, and they can derive general and special relativity from the starting point of hypergraphs. This is already amazing progress, as it hints at unifying the two theories. The big unsolved issue ATM is accounting for particles. At the level of the hypergraph, particles are probably vast self-repeating patterns in the graph, but it's been difficult to work out how they work, so we're very far from deriving the standard model.
This is high praise coming from Sabine. Didn't even get interrupted by Elon on the phone.
So maybe a fake? There is always a call.
@@steffenbendel6031 NO! The last call was long ago; it was the running gag of last year.
He's busy right now 😂
@@PhilFogle yes!
@@steffenbendel6031 maybe in the future, the calls will come from Wolfram.
Interesting take. I would also suggest the conversation between Stephen Wolfram and Donald Hoffman. I loved it.
"Wolfram is a curious case because he's not your average crank." - in his teenage years at Caltech, he was publishing research papers in theoretical elementary particle physics that were world class. He was prepped to become the next great theorist and then he changed directions. He's not really a mathematician, he's always been a theoretical physicist at heart and his mathematical approach was a way to get at the patterns deep in nature. It has been a many decades long project and even though I don't think it will be able to derive physics from these supposedly deeper principles, it is an astoundingly original and interesting effort - from a genius.
Physicist here. I had dinner with Wolfram recently and he is definitely not a crank. He has a deep curiosity to learn all he can about areas of physics that are relevant to his program but are new to him. He is humble in style yet confident that he is on to something. We need people like him who have original ideas and the skill, time and money to pursue them. (He became wealthy from his invention of Mathematica, a software program that is now used by thousands if not millions of scientists every day.)
@@squeakeththewheel When I was in high school preparing to become a physics major in college, Wolfram won the MacArthur prize and a small article in the paper about it brought him into my consciousness. My plan was that I would be his student at Caltech - I couldn't wait. Unfortunately, he moved on from Caltech before I graduated from high school.
give me the names of the "world class" papers he published as a teenager. I've seen this claim repeated everywhere yet no one seems to know what those supposed papers are
@@sereysothe.a I searched Google Scholar and came up with nothing.
@@sereysothe.a By 1981, he had published 25 papers in theoretical and computational physics, including, at age 18 in 1977, a paper on heavy quark production in QCD.
1978 - Discovered early connections between cosmology and particle physics
1978 - Fox-Wolfram variables for analysis of event shapes in particle physics
1979 - Discovered the Politzer-Wolfram upper bound on masses of quarks in the Standard Model
1979-1982 - Developed a computational QCD approach to the simulation of particle events
I'm obsessed with this channel - not only its content but also the presence of Sabine Hossenfelder. She gives so much structure to physics. Thank you from Portugal.
In fact whenever I'm reading a paper in math or physics I imagine it being read by Sabine. :)
We should definitely send some pastéis de nata to Sabine
And Mahalo from Hawaii.
Check out her earlier psychedelic music videos... Those were the days. 🎉
Mad female scientists.... They do exist!!!
Thank you for this video, and especially for giving Jonathan Gorard credit for his part in this work; he certainly deserves the attention.
Jonathan does great technical lectures on the theory on YouTube. They're most definitely worth the watch for people interested.
"The story so far:
In the beginning the Universe was created.
This has made a lot of people very angry and been widely regarded as a bad move." - Douglas Adams
Brilliant, concise. The phrase "theory of everything" reeks of hubris.
"It's a bold strategy, Cotton. Let's see how it plays out."
The big sneeze
Waiting for someone to figure it all out so it can be immediately replaced with something even more bizarre (but might make more sense to us ape descendants...)
Universe was always here.
I did stats classes in college and there were two professors I really enjoyed. They had different approaches to fitting a model. One was very fond of running monte carlo simulations and working out a model from the observed data. The other would like to work out the math for the model they theorized it would be, then tested it afterwards. A machine gun vs a sniper rifle. Either approach would probably work, and it was more appropriate to use one method over the other with certain problems. I feel like Wolfram is simply taking a different approach from a 'traditional' physicist because the bulk of his life's work wasn't in traditional physics. Perhaps physicists had a hammer in their hand and saw the problem as a nail, and Wolfram has a screwdriver in his hand and sees the problem as a screw. Time will tell which approach bears more fruit.
i want to agree with you but Wolfram is the one with only one tool.
I asked under another comment, but I'll ask here too, since I see so many links in people's comments:
How did you add it? Why did your comment not get auto-deleted?
Technically, with Monte Carlo simulations you're not running the simulation first and then figuring out the model afterward from the data that was produced. Rather, you have to decide on the model _beforehand_ (i.e., the probability distributions involved, their parameters, and the logic and math of how their samples interact), and then you produce simulated data from that pre-defined model.
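To make that concrete, here's a minimal Python sketch (my own toy example, not from this thread): the model - two fair dice and the rule that their samples are summed - is fixed before the simulation runs, and only then is simulated data produced from it.

```python
import random

# The model is defined beforehand: two fair six-sided dice (discrete uniform 1..6),
# and the rule that their samples interact by summation.
def roll_sum():
    return random.randint(1, 6) + random.randint(1, 6)

# Only then do we generate simulated data from that pre-defined model.
n = 100_000
hits = sum(1 for _ in range(n) if roll_sum() >= 10)
print(f"P(sum >= 10) ~ {hits / n:.4f}  (exact: {6 / 36:.4f})")
```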
@@anonanon6596 People don't add those links. It's a new "feature" of YouTube which automatically attaches them to keywords matching other YouTubers' videos.
@@anonanon6596 I don't think HE added it - probably new YT functionality, like context cards.
As a Mathematica user since the early 90s, nothing really ever surprises me about Stephen Wolfram's ideas and pursuits.
I also love Jonathan's attitude towards it. Even if the computational approach doesn't quite work, it may still produce mathematics that reveals new ideas - for example, the fact that curvature can also be a measure of non-integer Hausdorff dimensionality. Stuff like that. The way they are incorporating quantum mechanics in their models might actually teach us a lot! More power to them.
@@DrummerRF … but the problem Wolfram has is that his model is so radical that it doesn't really square with any extant models in use. So to date he hasn't even come close to providing anything even as simple (in conventional frameworks) as the g-factor calculation. In fact, he's made it clear that that would be a considerably lofty goal.
One of the mistakes I see critics of Wolfram's approach make is assuming that the computation steps correspond to increments of time. They don't, and this was made clear even in Wolfram's 2002 book (A New Kind of Science). The other mistake is failing to understand that observations from "outside" the hypergraph are not possible; the observer is itself a substructure of the hypergraph, and this affects what can and cannot be observed.
Exactly this. Slice thru the hypergraph at one angle, and the links you cut define space. Slice thru it at a different angle, and the links you cut define time. Rotate these slices, and you get space and time switching just like relativity says.
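For anyone who wants to poke at the idea, here's a toy Python sketch of hypergraph rewriting (my own construction, not Wolfram/Gorard's actual rules): edges are node pairs, and a single rule repeatedly subdivides every edge with a fresh node. The "steps" are just update order, not physical time - per the point above.

```python
# Toy rewriting rule: {{x, y}} -> {{x, z}, {z, y}} with z a fresh node.
def step(edges, next_id):
    new_edges = []
    for (x, y) in edges:
        z = next_id          # create a fresh node
        next_id += 1
        new_edges += [(x, z), (z, y)]
    return new_edges, next_id

edges, next_id = [(0, 1)], 2
for _ in range(5):
    edges, next_id = step(edges, next_id)
print(len(edges), "edges after 5 steps")  # 2**5 = 32: structure grows from one simple rule
```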
I'm not sure why it's billed as "incompatible with GR", since the best and most reliably confirmed theory we have (QFT) is also incompatible with GR. We wouldn't be looking for "a theory of everything" if GR and QFT were already compatible with each other.
This comment is identical to www.youtube.com/@RaeneeCarver-i3x. WTH?
"Wolfram's approach make is assuming that the computation steps correspond to increments of time."
Not necessarily. The issue is that people apply time because of the way computers depend on clock cycles. Think of it as requiring a transform (e.g. a Z or Fourier transform: time domain to frequency domain) to change it from the digital compute domain to the real-world domain. We don't have the means to compute the real world in real time with digital computers.
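As a concrete (and admittedly loose) illustration of hopping between domains, here's a minimal sketch assuming numpy: sample a 5 Hz sine in the time domain, then recover its frequency content with an FFT.

```python
import numpy as np

fs, f = 100, 5                          # sample rate and signal frequency in Hz
t = np.arange(0, 1, 1 / fs)             # one second of samples (time domain)
x = np.sin(2 * np.pi * f * t)
spectrum = np.abs(np.fft.rfft(x))       # frequency domain
freqs = np.fft.rfftfreq(len(x), 1 / fs)
print(f"dominant frequency: {freqs[np.argmax(spectrum)]:.1f} Hz")  # -> 5.0 Hz
```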
Exactly and even if it was increments of time it would appear (and behave) continuous to any observer "inside" the hypergraph
@@antoniopannuti2088 It would be fairer to say it's the other way around, since this is the older comment. @RaeneeCarver-i3x's comment is identical to this one.
I appreciate how Wolfram's approach tries to uncover underlying patterns to explain complex phenomena. It's compelling to consider that what we observe as the laws of physics might just be the surface of a deeper, systematic structure that governs everything.
As I understand it - and IMHO this is the main reason physicists should have a look at his work - part of Wolfram's genius is to remove the observer from the analysis and to study the hypergraph as a whole. This sidesteps our observer bias when considering motion and other things (since as observers we are part of the graph). His thesis is that the laws of physics are emergent properties of the hypergraph AND that we can only experience them as observers from within the graph. So we can experience the emergent effects, but by nature we can't experience the underlying principles.
That's a level of abstraction I initially didn't think we would need, but it makes sense given how complex systems behave.
Impossible to remove the observer.
@@johnrieley1404 Only in practice
I am a big fan of his observer theory as well. It is very intuitive given that all observers are made of topological solitons of spacetime and are intrinsically connected to the universe around them.
Damn. Does that perchance solve the measurement problem? A measurement is an interaction between two different parts of the hypergraph?
Why is removing the observer a positive move? The observer is and always was part of the system - not just a way to annoy physicists, but a part of the system under consideration.
I worked as a freelance technical writer for Mathematica in 1990 and have a lot of respect for Wolfram.
I wonder which things will become computable once something meaningful like (don't laugh) a Mathematica for Quantum ships. I'm not interested in how it can crack RSA. I'm curious about the incomputable things. Some are time-related.
I was a student at the University of Illinois getting a degree in Engineering Physics and took Wolfram's class on Mathematica. We ran it on Sun Microsystems computers and were essentially helping to debug the first release. I used it to plot electron orbitals in 3 dimensions and loved my time in that class.
Many of the people who worked for him were also brilliant. Theodore Gray ran that class IIRC and has done some really interesting stuff himself.
@@alansnyder8448 I love all of Theo Gray's books that I've read. He's a fantastic science communicator.
@@Olgasys Quantum computability is equivalent to classical computability though. You mean problems that are efficiently solvable by quantum computers?
@@quintium1 Absolutely. A lot of things are labelled impossible to calculate because they would take millions of years. There's also the issue that "if you can perfectly predict next month's weather but it takes two months, it is pointless".
Please do a long form talk with him ❤
He's very hard to interrupt, but I think Sabine would succeed where others have failed.
Hi Sabine - it would be very interesting to see a video discussion between you, Jonathan Gorard and Stephen Wolfram.
Absolutely. Would pay good money to see this panel/podcast.
Please focus on Jonathan Gorard. He seems to have a clearer understanding of the goals and limitations of the theory.
Epic Rap Battle ?
The most concise abstract ever at 5:37
She was and is remarkably unique.
Abstract answers are my favorites
Why? Because she can.
@@srr1463 the grand truth of life
I've been fascinated with Wolfram since the release of his much-lambasted A New Kind of Science decades ago, and it's only your videos and discussions of the academic world that have helped me understand his situation. Wolfram is famous for his ego and his radical pronouncements ("A new kind of science!!!1"), but I realize now, what choice does he even have to be heard in a world full of people with their fingers in their ears? His work challenges the academic consensus, and so everyone just laughs at him. I've got the utmost respect for him continuing this work for so long now despite the hostilities. The world needs more original thought.
Over 13 years ago I read Stephen Wolfram's book A New Kind of Science. The book blew me away - from my fascination with Mandelbrot's equations to the principle of computational equivalence and simulating systems with emergence on a Turing machine.
I actually bought that book and still have it.
I studied the model for 4 years (since 2020). Those in the know have seen me around, recommending people look into it and explaining Wolfram's work formally. I'm going to explain much more clearly what the W-model is.
Wolfram's work in A New Kind of Science was running computational experiments - specifically proofs by exhaustion (running entire rule classes) - to show the following three facts:
1) That rules can produce arbitrarily complicated behavior - specifically, behavior NOT DESCRIBABLE by mathematics.
2) That rules can be generalized as falling into 4 classes of behavior: homogeneous, patterned, random, and complex.
3) That rules can emulate each other's behavior, like showing how rule 22 can emulate rule 90 with a different initial condition.
The 3rd observation from those experiments is the most important one, because in the latter half of the book he uses this ability of systems to emulate each other's behavior to argue for his principle of computational equivalence: that all rules in some sense sit in the same rule space. He then goes further and shows that this rule space is equivalent in power to a universal Turing machine. By proving the universality of Rule 110, and showing many examples of how these CAs emulate each other, the idea is that you can string together rule emulations to emulate Rule 110 and thereby emulate a universal Turing machine.
The reason the principle of computational equivalence is important is that it is an equivalence statement for ALL systems: any systems that follow rules are equivalent to each other, by being classified as universal Turing machines. This is also what separates it from other similar ideas in the space, like the Church-Turing thesis, which says all systems can be simulated by a Turing machine... The principle of computational equivalence is a statement that all systems ARE Turing machines. Clear difference.
Following the principle of computational equivalence is the formal argument of computational irreducibility: because all systems can be considered equivalent to universal Turing machines, every system is computationally irreducible - finding out what they do is equivalent to trying to solve the halting problem. Again, this is a clearly novel and distinct statement... a statement that is stronger than superdeterminism.
The above leads to the construction of Wolfram's Ruliad: all systems "sit" inside this single complexity space... are equivalent to each other and to this space of a universal Turing machine. This equivalence is formal: a graph isomorphism, and if you are a system embedded in this graph, you must preserve the structure. For example, if I'm sitting in one corner of a room looking at an apple and you are sitting in another corner seeing the same apple, the room is still the same room; we just see the apple from different perspectives and therefore see different things... but it is still the same underlying structure. Generalize this to spacetime, where one reference frame and another can see different perspectives of spacetime events... but it is still the same underlying structure - hence relativity.
The Ruliad is the mathematical object of this fully computed Turing machine... and THIS is the object with respect to which transformations are structure-preserving. That space is far more complex, and "bigger" if you will, because it contains all possible events... future, past, everything... so in other words time doesn't exist; we just perceive time with respect to this Ruliad object. Consequently the entire model comes from this aspect alone: the Ruliad eternally exists, and our perception of the Ruliad is what gives us spacetime, QM, and statistical mechanics - as limits of our perception of this infinite object as finite systems ourselves. This account of how we perceive that object is the W-model.
There's so much more to this model, and if you take enough time, you'll realize it is beautiful. Saying that systems are Turing machines comes with a lot of elegant consequences, such as the fact that systems can be thought of as computers in the same way we use computers now... that the universe is an ocean of unlimited potential and that we are "sampling" it to get what we need, for things that are useful to us... like a kind of programming.
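If you want to see those behavior classes for yourself, here's a minimal Python sketch of an elementary cellular automaton (my own code, following Wolfram's standard rule numbering, not anything from the book):

```python
# The rule number's 8 binary digits give the next state for each 3-cell neighborhood.
def run_ca(rule, width=64, steps=24):
    table = [(rule >> i) & 1 for i in range(8)]
    cells = [0] * width
    cells[width // 2] = 1                             # single-cell initial condition
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = [table[(cells[(i - 1) % width] << 2)  # left neighbor
                       | (cells[i] << 1)              # center
                       | cells[(i + 1) % width]]      # right neighbor
                 for i in range(width)]

run_ca(30)   # rule 30: the 'random' class; try 90 (nested pattern) or 110 (complex, proven universal)
```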
If interested in his work, watch videos in this order :
New Kind of Science Series (1 - 16)
How universal is the concept of numbers
Can AI solve Science
Observer Theory
Stephen Wolfram Readings: What’s Really Going On in Machine Learning? Some Minimal Models
Also read The Concept of the Ruliad on his blog.
@@NightmareCourtPictures Man, I find Wolfram to be worthwhile and interesting and I was also an early interested party in his physics model, but you really want to take anything he claims with a big pinch of salt. He CONSTANTLY restates old principles and sometimes seems to just rename them and claim them as his own…
The "Principle of Computational Equivalence" (or whatever he's calling it) is a great example of this behavior, it seems to me. Turing and Church's ideas for general computation seemed enough - what is Wolfram actually adding here!?
Actually, his whole book ANKS was interesting, but again, it’s not any different than the field of non-linear analysis, which preceded him by DECADES.
I wish Wolfram wouldn’t do this. As I said, I find his talks and writing interesting enough on its own merit, and I don’t see any value he gives himself by such grandiose claims.
This is all just my opinion - I could be wrong. But I have a strong suspicion that I'm not in Wolfram's case.
Thank you. It's nice to have a starting place.
If rules are the foundational building blocks - and by rules I assume we mean anything that can be described / axiomatically stated - how does this approach avoid the limits on describing reality discovered through Gödel's theorem (and its derivatives)? That theorem is basically interpreted as saying there are things obviously real and true - part of reality - that can never be computationally proven, i.e. deduced from an axiomatic approach. I expect this has been considered by W, but neither this video nor your explanation seems to address it, except mentioning the halting problem in passing without saying much about it.
P.S. It was quite some time ago that I left my uni comp-sci math classes behind.
@@chriscurry2496 I explained the difference between the Church-Turing thesis and computational equivalence: saying that all functions (which we can think of as an apple) can be calculated on a Turing machine is not the same as saying that all systems, including apples, ARE Turing machines. It was not at all obvious or apparent that the universe could be defined as being like a computer program, where systems like apples could be defined as programs... not until recent times (in the age of Wolfram).
Other people around the time of Wolfram had ideas about the universe being like a computer, but in general they all miss the mark: trying to use mathematics to describe it... which leads to the point made below:
NKS is not like non-linear analysis. One of the big points of the book was to show that mathematics like statistical analysis is not robust enough to describe systems like the cellular automata - hence the first observation: "Many rules were not describable by mathematics."
"Rule 30 does what rule 30 does" is the only way to properly describe Rule 30 and what it does.
NKS covers this, and so too does the lecture that should follow that series, "How universal is the concept of numbers", where he focuses on how numbers fail to describe systems, yet it is ultimately this feature "of counting things" that forms physics as we perceive it. Equations are constructs born of the limitations of our perception of this infinite, uncomputable thing that is the universe... again, that is novel.
@@NightmareCourtPictures Man, I'm sorry, but a lot of things you say here are just flat-out false.
The points about Turing/Church are not even correct. Read what Turing thought, for instance: he was very much considering things in the universe to be computational.
Your point about cellular automata "not being representable by mathematics" is just ignorant. I'm sorry to be so blunt, but that's the truth. I mean, cellular automata ARE mathematical (see Peano Arithmetic for a simple example)! The entire theory of computation (under which CAs fall) is mathematical!
Whenever you see a program, that is equivalent to a theory in mathematics - except that it's somewhat limited. In fact, the field of mathematics includes constructs which are MUCH more powerful than those in computational theory (see the real numbers, for instance).
Great explanation of how computational, incremental theories like Wolfram's are, like quantum mechanics, incompatible with general relativity. Reminds me of a notion I had: that Zeno's paradox can be explained by the idea that nature is discontinuous, made up of discrete elements of space and time.
She recently made that video about the quantum Zeno effect - very effective.
This sort of interdisciplinary work should be, with some fair scepticism, strongly encouraged. It may lead nowhere, but the process of an ideas guy joining dots from different specialisations so that others shoot them down, before they are then refined, is healthy scientific inquiry. It forms a sort of search tree and opens up new approaches. Dogma and gatekeeping, eye rolling about heterodoxy, shouldn't be the reason we dismiss these paths.
Gatekeeping is always a good thing.
In fact, I don't know how you missed that he's being gatekept here.
Gatekeeping is making him come towards something that makes more sense.
@@TessaBain Perhaps because that word can mean different things. Knowledge doesn't respect campus layouts. If people from one discipline have the institutional power to decide the future of ideas they aren't in a position to fully appreciate, that can be a problem. Some physicists make the effort to provide constructive criticism, but others, as Sabine suggested, should take Wolfram more seriously. Perhaps if they had sooner we'd be further along.
Or lead to development of new tools, that will be useful elsewhere.
@@marca9955 Perhaps, but we can't act as if we had an infinite amount of time and other resources. One takes seriously what seems promising, and offers whatever support or help one can provide. One can be wrong, of course. But Wolfram doesn't really lack resources, and other physicists have a right not to find what he does interesting, without wasting their time criticizing it.
You misunderstand at 2:45. His theory doesn't have time, only a sequence of events. Time is an emergent phenomenon that is used to make approximate models, but it likely makes it harder to understand the fundamental rules.
As Sabine put it here in one comment correctly, "there are two types of time, the one in which you calculate the hypergraph and the type that the graph represents"
Time = sequence of events though😂
@@javiersoto5223 no, you'd need to have an (emergent) total ordering on events from the model for it to have time that behaves 'like time'. that's a further restriction that his more general models don't necessarily need to follow
@@serbestianmilo1477🤦♂️
@@javiersoto5223 In relativity, there is no universal clock.
Sabine, I've admired your skepticism of junk science so much, and so I was pleased to see you're also encouraged by Wolfram. I think he's onto something, but you're right his work is so hard to understand. This was a great video. Please never stop doing them!
I’m curious, what’s hard to understand? He’s just saying that what we perceive as time and space are the effects of multithreaded computation constantly rewriting a hypergraph.
@@S.DaleMorrey The analogy sounds great, but can you follow his math? It's way harder than, say, Einstein.
@@GeoffPlitt Well yes, you can follow the math. It's Conway's Game of Life on steroids. Trying to visualize emergence is where it becomes difficult, but we've all seen how simple computational rules produce complex emergent behavior. That behavior is what he means by computational reducibility arising in otherwise irreducible computational spaces.
@@S.DaleMorrey If Sabine and I can barely follow his math, you sure as hell can't :)
@@GeoffPlitt Ad hominems begin where wisdom ends. As a computer programmer for over 30 years, I see this differently. The math is quite simple. If you're having trouble, it's likely because you need to learn about emergent behavior in complex configurations of otherwise elementary computations. Start by trying to learn how a simple calculator works at the mathematical level.
Then move onwards to how a computer works. Can you explain how a piece of silicon can produce the ability to sit here and have this discussion while simultaneously allowing us to view Sabine's video?
If you cannot do that math, then you're going to have a lot of trouble following Wolfram's computational model because what he's saying is that all our physical laws arise from computation of very simple rules.
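For what it's worth, the Game of Life comparison is easy to make concrete. Here's a minimal Python step function (my own sketch), where a few lines of rules are enough for gliders and, famously, universal computation to emerge:

```python
from collections import Counter

def life_step(live):                     # live: set of (x, y) coordinates of live cells
    counts = Counter((x + dx, y + dy) for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # Birth on exactly 3 neighbors; survival on 2 or 3.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
for _ in range(4):                       # after 4 steps the glider reappears, shifted diagonally
    glider = life_step(glider)
print(sorted(glider))
```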
I didn’t understand anything but I like to hear Sabine confidently talk about physics
"There is a theory which states that if ever anybody discovers exactly what the Universe is for and why it is here, it will instantly disappear and be replaced by something even more bizarre and inexplicable. There is another theory which states that this has already happened." - Douglas Adams
My theory is that it has happened more than once.
Used to be 'once', but now it's bizarrer.
RIP Doug
Same
Sounds like one of those iterative processes.
My theory is that it happens every Thursday
This is a great explanation of Wolfram's theory, thanks Sabine! I've been trying to read and understand it for quite some time, but your 12-minute video definitely put it in the right perspective.
Wolfram recently did an entire series on his YouTube channel where he goes through every chapter of his A New Kind of Science book and just talks about the ideas in each chapter. It's a wonderful set of videos, and you get to hear the interpretation of the ideas straight from the source.
What was the channel called? I searched Wolfram and found lots but nothing matching your description. Thanks in advance.
@@michaelstreeter3125 ua-cam.com/video/a2hD9Bwc0EU/v-deo.html&ab_channel=Wolfram
I think I saw the one on entropy and the second law. I was not convinced it works.
@@michaelstreeter3125 probably Wolfram @WolframResearch
@@michaelstreeter3125 YouTube keeps deleting my comments for some reason. Search for "What We've Learned from NKS" to find the series.
Great summarized explanation of the story, thank you!
One nice thing about this is that it brings a different approach to these questions than most scientists bring, which will open up new clarity of explanation and most likely new learning for science too.
My dream is to see you discuss this directly with Wolfram in a podcast or something. I think it would be incredibly productive. Please please please pleaaaaaase :)
"If you don't know the answer, you're right"
Yes! Nailed it! 3:59
We are geniuses!
Actually there is an answer; the energy is what we create, light is made when we make our ATOMS interact so that they may then produce PROTONS. All things are vibratory and there is a right answer for everything.
ha, ha ; )
@@DivineAlchemyOfSouls now explain pentaquarks
@@lolerie well i got 5 quirks and each have a letter attached 😂😂
As an undergraduate, one of my lecturers at university was Fay Dowker, who was a student of Sorkin and works on Causal Set Theory. She's an awesome lecturer and gave us an outline of the theory in one of her lectures. I remember Wolfram's theory coming out around that time and thinking "this sounds a lot like Causal Sets…". Interesting that he and his research team have taken the ideas of Causal Sets on board.
I'm super happy we finally got your opinion on Wolfram's theory of hypergraphs. I love opinions. 😊
Hi Dr. CoffeeBean🌺
@@Thomas-gk42 hi!
Ok, here's one of mine. Enjoy...
What if Dark Matter and Dark Energy are rivers, pockets and/or bubbles of slower or faster moving time?
We already know time can be manipulated. Imagine seeing an electron orbit however many times a minute. We set that as our base reading, right? For our plane of existence, anyway. In these pockets of Dark Energy or "Light Time," that same electron now has readings of a billion times a minute because time is "light" here and passes a billion times faster. Giving us readings of energy where there shouldn't be energy = dark energy = Light Time. Dark energy IS "Light Time." It's the difference in TIME that we're seeing repelling everything around it
Now we look at the other side of the coin. Dark Matter or "Heavy Time."
"Heavy Time," would be slower rivers, bubbles and/or pockets of time. That same base electron now moves a billion times slower and appears to stand still. "Starving of time," or slower pockets in time, creates a vacuum or attraction in space. Making it appear dark matter is attracting when really it's just a vacuum, IN TIME, trying to equalize.
If I'm right, a white hole's existence flashes by in an instant and a black hole's existence lasts forever.
nobody ASKED for your OPPINION
(XD JK) (sorry)
(I love how people get spicy in the comments section... about ... comments... 😁)
@@michaelccopelandsr7120 dude, makes sense!
@0:22 I thought Stephen Wolfram was a theoretical physicist who became a computer scientist later on when he developed Mathematica. According to Britannica, he got his PhD in theoretical physics at the age of 20 from Caltech.
I mean, this isn't really relevant.
@@lluvik2450 says you
@@jzay1899 lmao
@@lluvik2450 I think it is, because the topic is theoretical physics and Sabine is a theoretical physicist. When she says he is a computer scientist and a mathematician (which he has no degree in either, by the way), it gives the listener the impression that Stephen Wolfram is working on something outside his field of study, and thus diminishes his credibility. This is like saying Albert Einstein is giving a lecture to a bunch of biochemists about his theory of how life formed on Earth.
@@paulah1639 If this is relevant, then it's also relevant to know that Dr. Sabine is a mathematician too, and not 'only' a theoretical physicist.
I join Wolfram’s weekly streams as much as I can. I love that he engages with viewers and answers questions. He’s without a doubt one of the most intelligent individuals I have ever known.
Here is an honest question, "one of the most intelligent individuals I have ever known" does that say more about you or him?
@@katgod reading this comment from you dropped my IQ by a couple points.
Wolfram is one of the top mathematicians alive today. His mind is incredible. Why aren't people paying much more attention to him?
Are you aware that to be a mathematician you must create new mathematical results? Your claim is like saying that the Queen of England is the greatest marketer in human history.
It is 'nice' to see you talking about your area of expertise again.
One thing I like about Wolfram is his wealth of knowledge about the history of physics and maths, and where physics has come from. Apart from having earned a PhD in theoretical physics at a young age, I'd argue he has a better understanding of the context and history of where ideas in physics come from than most physicists.
IDK why this is (I have attention deficit disorder) but learning the 'historical framing' of discoveries is, like, key for me. It's like with the story of Eureka! and Archimedes - the story itself is the 'meme' in my gray matter that I can utilize in speech and thought. The discovery is the science at the inside of that meme. OR SOMETHING IM JUST SPECULATING. Wolfram's historical encyclopedic knowledge of physics has been tremendous, for me. And its all free. What a man.
Nah.. there's nothing worse than a pure mathematician poking into physics! They always mess things up! They've always made abstractions that had no grounding in the real world.
Sure, they can "polish" a theory.. but no pure mathematician ever came up with an original idea in physics.
Knowing history helps nothing here...
And, yes, before getting into a debate.. Wolfram is far from a physicist, no matter what he claims!
His whole body of work is nothing more than mathematics.
@@phobosmoon4643 I'm similar. 🙂 I hate learning stuff without a frame or context.
I find it odd that everyone describes Wolfram as a computer scientist only, ignoring that his doctorate is in theoretical physics.
He's learned a lot more doing his own work for the majority of his life than he learned in grad school! So if anything he's a computer scientist and businessman!
Yes!
Yes, especially ironic given he got his PhD at Caltech with Feynman on his thesis committee. 😅
Your piece of paper does not define you: your doctorate in "theoretical physics" does not make you a "theoretical physicist".
@@srr1463 Just like an MD (Doctor Of Medicine degree) does not make one a doctor?...
Wolfram was the only one crazy enough to actually break through this calcified field. And Gorard is taking it to the next level. Gogogo! We are rooting for you!
Thanks for the wonderful video Sabine. I worked on Causal Sets for three years on black hole entropy. I worked on string theory for a year. In my personal view, causal sets is the simplest yet most elegant theory of quantum gravity. It is unfair how science funding is not distributed more equally among different approaches to quantum gravity. Many young physicists are forced to choose a different career trajectory because of this. I really appreciate you bringing causal sets to your platform for people to notice. It is through joint collaboration and the sharing of ideas between different approaches that we can efficiently tackle this century-long problem, not by claiming that 'our theory is the best and everything else sucks'. I think the science community should appreciate ideas like 'causal sets' more.
I don't know much about quantum gravity, but I can comment that *in general* it does not make sense to distribute money equally among various hypotheses to solve a particular problem in any given field. The funding agencies, be they public or private, want to get a return on their investment - more so in private, but certainly also in public funding. This means there has to be scrutiny over which studies get funded, as otherwise money would simply be wasted on a massive scale.
Science has always looked for consensus to determine what works, as the absolute vast majority of ideas that would break fundamental aspects of consensus theories are not true and likely also rubbish. Science is conservative, but at the same time ready to change when evidence is produced against an existing theory. If you can convince someone to fund your study, great, go for it, but don't expect society to fund you just because you had an idea.
In our society there seems to be this underlying current of thought that science and scientists are in some ivory tower, looking down on the people "who really know". All scientists, at least the good ones, are very down-to-earth people who really want to solve the problems we have in our society. We need to support them, not demonize them.
@@hubbeli1074 But how do you know, when you scrutinize, whether an idea will cause a breakthrough in the long run or not? What we've seen is that public funding basically puts all its bets on one thing, which is determined by an informal popularity contest (public bias).
@@BboyKeny Do you have a study or facts, other than your feeling, that this is the case? My main comment was that "equitably" funding all ideas seems like a really bad idea. I want my tax money to be spent efficiently, and as I am not an expert I can only assume the real experts making the funding decisions are doing the best they can, given restrictions on the amount of funding and e.g. political pressures over which they don't have any say.
If there is an argument against this stance, then a properly articulated opinion, with concrete examples of what should have been funded at the expense of something else, should be presented.
Dear Sabine, I very rarely leave comments, but in this case I have to... I loved this video!!! I respect and appreciate Stephen Wolfram's work immensely
I can appreciate the fact that somebody is willing to pursue their ambitions despite mass criticism. Double kudos when those ambitions aim to enhance my understanding of the universe.
A computer program doesn't need to show things in an incremental way. The program can compute the whole state of the universe that will be "shown" in the next step, and then show it. So the computation can still be done incrementally, but the universe would never feel that, because the universe only sees the already-calculated "state". A step might take a very, very long "time" to calculate in the "place where that computer is working", but that can't be felt in our universe. It might even work such that each step involves a horrendous number of loops converging values to a set precision (which, I guess, doesn't necessarily need to be fixed - it could differ for each step and each value, with the precision estimated based on something).
Take electromagnetic radiation. It exists as waves; it is analog. Currently, in the binary epoch, we use an analog-to-digital converter for processing and managing the wave. The binary product is a representation that mostly works, but it is not the phenomenon itself, which is forever lost in the conversion. Even if we convert using a nearly infinite number of binary steps, the phenomenon being converted is lost. If we took a grand microscope and looked at the elementary components of the model, we would see binary steps, not the analog radiation. The model and the event are substantially different in implication, to a rather large degree that cannot be overcome through more stepping. To me, a continuous phenomenon should be modeled using a continuous system.
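That information loss is easy to demonstrate. Here's a tiny sketch assuming numpy (my own example): sampled at 10 Hz, a 9 Hz sine produces exactly the same samples as an inverted 1 Hz sine, so the original wave cannot be recovered from the samples alone (aliasing).

```python
import numpy as np

fs = 10                         # sample rate in Hz, far too low for a 9 Hz signal
t = np.arange(0, 1, 1 / fs)
nine_hz = np.sin(2 * np.pi * 9 * t)
one_hz = -np.sin(2 * np.pi * 1 * t)
print(np.allclose(nine_hz, one_hz))   # True: indistinguishable once sampled
```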
If this were the case, would it not be proof that there is something outside of our universe?
@@damianlukowski9996 You could never prove it. If the characters in a video game can never see the code, because they're trapped in the simulation, they can never prove they're in a simulation.
Unless the game happened to be designed so as to allow them to uncover the code
@@Humungojerry Yeah this just is not a sensible conclusion. It may or may not be the case that digital dwellers "in the game" unmask the code which implements them. It's rather analogous to actual physicists unmasking the laws of physics--whether or not we are ultimately successful depends on many things, such as how accessible the laws are, how complex / simple they are, etc.
In the case of digital physics, I think the proper analogy would be a virtual machine. It's known that some exploits allow for guests to discover and break out of their virtual machine container, and thereby start interacting with the host system.
In short, it depends. It may or may not be the case that such creatures could discover their implementation.
Google places Einstein's height at 170 cm. Check out the dude at 2:15. He must be an alien and that would explain Albert's insights into physics...
People were shorter back in the day
Thanks to you I just discovered the Wolfram Language! AMAZING! I am a C programmer and seeing what this language can do amazes me so much!!!😮🤯
Hmm, his space-as-a-network theory does have a certain charm to it. GR says that space can't have an absolute reference frame/coordinates, but a network doesn't have one either. It's all "relative" by definition, and so it kind of automatically aligns with relativity.
If that's the case, then technically quantum effects happen at the node scale, and gravity is simply the arrangement of the nodes together, i.e. their connections.
Very intriguing. It's less that reality is a simulation and more that computation itself is an exercise in emergent behavior and properties. And so, using what we have learned from studying computation, we can see that the network data structure might be a nice fit as the simplest organization of reality that can still lead to the emergent properties we call physics.
It's worth investigating, and it's a better reason to do it than the traditional "this would make the math beautiful/elegant" that we've been using for 50 years.
Didn't Gödel prove there is no algorithm that can explain all of mathematics, thus making a theory of everything a null argument? Wouldn't a "theory of everything" be everything?
@@techsuvara God, it's been a hot minute since I studied that, but I believe Gödel's theorems are about completeness/incompleteness. Basically, any system of logic (math included) must choose between "being able to prove everything" and "making sure everything you prove is actually true". What he proved is that if you have one, you can't have the other. If you can prove everything, it means every true fact can be proven, but it also means you can, using that same system of rules/logic, prove things that aren't true. On the other side, if your system only proves true things - i.e. every single thing you prove to be true is in fact actually true - then there will be some true things that your system, no matter how you manipulate the rules and logic, will never be able to formally prove.
Put another way: if your circle is big enough to have all the true things inside of it, some false things will also be in there. And if you shrink the circle so that all the false things are gone, then some of the true things will also be outside the circle. I.e. you can't have a circle that has all the true things inside and all the false things outside. (This is a gross oversimplification, but the general idea holds.)
I do not believe it actually says anything about "computation" itself. It shouldn't affect any computation that describes all of reality. I think the confusion may be conflating the "theory of everything" with the idea of "every single theory". Gödel's theorems talk about "every single theory" but not the theory of everything.
There is a weird subtlety to these ideas, i.e. what it means to "prove" something. But some of it is more semantics.
@@aggies11 A simple way to put it is that any sound formal system has limits on what it can express. Any formal grammar (such as a language grammar) is also a formal system.
From my research, it's related to recursive and self-referencing relationships.
One interesting thing is, quantum mechanics is expressible in formal systems too...
My intuition is that no computational model can describe our reality. This is because reality exists outside of what we observe externally. That's just me as a panentheist.
@@techsuvara Yeah I think that's basically what I remember. But that's about whether something can be "proved" not necessarily expressed, right? And to compute something you just need to be able to express it? So there are math statements that we can express, but can't actually prove that they are true or not. So as long as we can express the math of a possible "theory of everything", even if we can't prove it's "trueness", it's possible to still compute it?
I do remember the halting problem, but I don't think that's related. Undecidability I think it was called? I think that's separate from incompleteness.
@@aggies11 The halting problem is related in that there exist proofs on the limits of formal systems, Turing machines, and formal grammars.
I've had difficulties with both Wolfram (since the 2002 book) and Tegmark (I view his "Mathematical Universe" as a close analog of automata theory) in their inductive approach to the logic of theory-building. It seems to me that if you: 1. establish your first principles in a way intended to produce a known result; 2. continually tweak the model when the iterations fail to produce that result; 3. stop iterating once the desired results are obtained - then all you have done is reproduce the previously known results in your "new" language. Thus, even if these inductive processes lead to the answers the researchers are looking for, they have not created anything explanatory or "new."
What if after reproducing the known results, their final models generate new, testable predictions?
It will still be new *if* the theory in the end can make testable predictions that previous theories can't explain. That remains to be seen.
Well, that’s exactly what the last Physics Nobel prize was given to.
Actually, this is how a large part of research mathematics proceeds. The results of this method have certainly proved valuable in very many cases -- very many of the modern definitions of mathematics, for example. Furthermore, if the new language is simpler than the old one, I believe the "rephrasing" can indeed claim to have explanatory power; even if it is not, a new perspective can often prove useful.
It all has one simple philosophical assumption at its core: that if we get a model explaining everything we observe, we magically come to know the nature of everything in its entirety. It's like fitting a polynomial to a number of points of a non-analytic function and thinking we now have a formula describing the function completely. Wrong, wrong. It will be wrong the moment we leave the realm of already-known observations. Such theories should be measured on technical applicability and the predictions they provide. We already have tools that give us a quite good description of what we observe.
Kinda underrated channel, keep up the work man👍
Yes!
"If you don't know the answer, then you're right"
I'm right so many times while watching your videos! 😎
Going to school was a mistake
@@damianlukowski9996 The people who don't go to school are the ones who don't know the answer and think they're right anyway.
Computation is only incremental if it's digital in some way... Analog computers have no problem doing continuous computation.
Also, QM and GR are already known to be fundamentally incompatible, so we already know that there's something wrong with one or the other.
Or perhaps there is a presently unknown bridge theory between the two that requires another Einstein. The problem not being that either is wrong, but that a piece of the puzzle is missing entirely at this time.
They aren't actually continuous. The bits are just chopped up really small.
Analog is really fundamentally discrete - you still have bandwidth limitations. What we see as analog is just very zoomed-out discrete transitions, in frequency, bandwidth, voltage, etc.
@@BenWard29Idk about that chief
But if it is not incremental, computation is not distinguishable from the mathematics underlying normal physics laws.
I really like Sabine's explanation. I would ask him what results his theory can predict that aren't already covered by the other fields, or where it yields different results.
What if we stopped looking for the Truth about the universe and started looking for useful frameworks to predict specific things that happen in the universe? If we did that, maybe efforts like Wolfram's wouldn't seem so threatening to career physicists. Maybe this tool, or some pieces of it, could be useful in some instances? Then maybe we'd all give up the dream of being the "next Einstein" who apprehends the mind of God, and we'd make progress towards predicting and influencing the future, which is mostly all science can do for us anyway.
yes indeed!
Stopping looking for one thing does not make you more efficient or successful at looking for a completely unrelated thing. There's no reason not to look for the both the practical and the truth.
What if there can't be a "Theory of Everything", and that the nature of the cosmos is such that it can't be understood or described from a single viewpoint or mathematical expression? The obsession with a TOE seems unreasonable, given that we are unable to even correctly estimate the number of stars in our own galaxy.
You mean development oriented towards people's necessities? Where have I heard that before 👁
@@SabineHossenfelder Physicists are working on this, though they are not working with Wolfram. It's called constructor theory. Time, space, and particles follow certain simple rules defined in constructors, and the system evolves according to the "simple" rules and constraints encoded in the constructor, which then runs as a program.
I've been listening to the Lex Fridman podcasts where he talks to Wolfram at length about the Wolfram Physics Project. As a programmer myself I found the explanations really clicked for me. Very exciting. I've been waiting to hear your take on these simulations.
Thank you for yet another video which, while it fascinated me, was way beyond my comprehension.
I am tempted to check out Brilliant.
Given how tired everyone is from looking at relativity and quantum mechanics copulation attempts, this is a breath of fresh air in fetid sewers.
they are tired because the problem is hard and Wolfram and co do not come closer to explaining it merely by showing that a whole new approach is possible. he does not even make unique predictions that are potentially falsifiable
Well good luck since "Quantum" everything is just made up BS like string theory.. can't make sense out of something that isn't real.
@@shmuliknemanov4009 That's bit of a "darned if you do, darned if you don't" situation. Would he seem any more credible if he did make unique predictions using novel methods? No, and there's already plenty of papers out there that have been torn apart for doing so. Because it has to first start out showing that it lines up with observations that have already been made, to show it isn't just some unserious attempt at attention-seeking from an amateur scientist.
@@Vaeldarg I think your definition of unique prediction is not what I had in mind. By unique predictions I mean the expansion of the explanatory power of a current scientific model. This happens when more facts can be modeled with the same number of rules, or fewer. Once this is achieved, it is not such a trick to point out that an alternative theory can also do the same expanded modeling; it needs to account for even more facts than the newer theory. Wolfram has not done this. Therefore he contributes nothing to science. Instead he is making a (not very original) point about the philosophy of science.
Aren't all sewers, 'fetid'? Did you really need to add that?
05:35 Best Abstract Ever.
Indeed, she's simply unique.😅
The answer is obvious: the observer needs to be outside space/time so as not to influence the results. This implies that universal constants are also multivariable. We can already see a hint of this in the speed of light, c, which is constant but only when not travelling through a medium. Gravity, atomic forces, and quantum effects are all variable. You only need one algorithm to define 3 points: 1 outside observer and 2 points in space at a given 4D distance.
Way over my head, even conceptually... but somehow very enjoyable to hear about a different way of looking at our "reality" - a model to explain how our Universe/existence works.
Then give Donald Hoffman a shot. It's way easier at surface level and way more mind-bending if you're willing to listen. No, I'm in no way smart enough to comprehend the mathematics behind it, but the idea is very, very interesting.
What a fun vlog today; I liked this one very much. As a former practicing geophysicist and geologist, now an IT guy due to most mines going extinct, when I heard this talk about Wolfram, it struck me how many similarities there are between his graphs and the late Stafford Beer's "Viable System Model" (VSM). Beer created it to help Salvador Allende and his finance minister reorganize Chile; however, that went very south. Still, VSM is big in informatics for describing systems thinking. Yes, it has nothing to do with either physics or quantum physics, but there is a twist, as described by Patrick Hoverstadt in his "The Fractal Organization": VSM is fractal, something that reminds me of Wolfram's ideas.
And VSM can be used in natural science, though only some have observed this. Two who have done so are the social theorists Lave and March, who, in their seminal theory-building book "An Introduction to Models in the Social Sciences", used the analysis of a sedimentary profile as a model for building social theories. As I read this, it became apparent - as a former geoscientist and amateur astronomer - that I could use VSM to describe the Universe from the Big Bang to Earth's biology as VSM fractal objects, similar to what Wolfram seems to do.
Absolutely not the intention of Beer, but I have used it several times in IT lectures, because it makes it easier for students to grasp the total involvement and relations of everything, and to explain the Universe in astronomy outreach. It works perfectly, though, with no Lorentz limitations or such. Maybe Wolfram and his collaborators are on to something. 😉 Thanks for this vlog.
I'm a developer and a Latin American leftist, yet I had zero idea about Mr. Beer, his work, or his collaborations with Allende. I went down a deep rabbit hole after reading your comment and learned an enormous deal from it. Thank you so very much.
I'd like to hear the conversation between Gorard and Hossenfelder. Although I don't understand a thing he is talking about, I am convinced that just listening to him can make you smarter.
This is so much more interesting than the usual YouTube recommendations.
There is so much observer dependence in QM and GR, that sometimes it begins to look like some kind of multiverse. I'm in my region of the multiverse, and you are in yours. There are ways to map what I experience in my observer centric universe into what you experience in your observer centric universe. Our universes aren't isolated from each other as long as we occupy communicating regions within our light cones. Just a fanciful way of looking at things...
Always been a fan of both Wolfram and Sabine and finding this intersection quite inspiring!
I'm a software engineer, and while I'm no expert in graph theory/algorithms, it makes intuitive sense to represent the "fabric of reality" as a graph. In a video game, fabrics are literally represented as graphs - at least, if you want any good representation of a fabric. The edges of a graph in this sense don't need to have a physical length; they're just an abstraction for the relationship between two nodes - an observer and the thing that is observed, in a sense.
Like I said, I'm not a physicist and so I have no real idea about the laws of nature and what the established body of knowledge states, and I may have misunderstood your critiques, but if space-time can be an emergent behavior -- it being the result of a calculation based on two related points of the graph -- then the act of calculation being the thing that produces space-time would imply it's not constrained by it, no?
I'm not sure it being an iterative calculation is a good counter-argument as computers are iterative due to the nature of the constraints placed by the physical world. That being said, I think video games may be a good advocate for an iterative calculation as we have tricked ourselves into thinking the images displayed on the screen are a continuous stream of data when they're really just fast-moving still frames. We may not have sophisticated enough measuring tools to say there isn't some iterative calculation. Maybe that is why there is a speed limit to the universe -- the speed of light is simply the "frame rate" or "clock speed" of the universe over a given amount of time/space.
I think this would also explain why things tend to behave as waves. As each point of the graph observes the computation, it then communicates this change to the nodes it's connected to, causing a pulse-like behavior. I have no idea how this would explain the "spooky action at a distance" of QM. It could be that there is a connection between every single node in the graph and that only specific types of computation affect a select few connections of those nodes. From a computational perspective, however, that's very resource-intensive. Not only would having X^2 (I think?) connections in the graph take up a huge amount of RAM, but filtering those lists to affect only (an assumedly) small and specific subset of neighboring nodes is computationally intensive. But that's also from the perspective of traditional computers.
I'm no physicist, so this could very well be just naive rambling, but it does make intuitive sense from a computational standpoint.
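The pulse idea above can be sketched in a few lines of Python (my own toy, not any actual physics model): a state change spreads one hop per update step, which automatically gives the network a maximum propagation speed.

```python
from collections import deque

def propagate(adjacency, source):
    arrival = {source: 0}                 # update step at which each node hears the pulse
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for nbr in adjacency[node]:
            if nbr not in arrival:
                arrival[nbr] = arrival[node] + 1   # one edge per step = built-in speed limit
                queue.append(nbr)
    return arrival

ring = {i: [(i - 1) % 8, (i + 1) % 8] for i in range(8)}  # 8 nodes in a cycle
print(propagate(ring, 0))   # arrival step equals graph distance from the source
```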
If the world is computational, anything that is Turing-complete should be able to do the trick.
You might then be interested in physicist Tom Campbell's work.
> it then communicates this change to the nodes it's connected to causing a pulse-like behavior.
That's exactly what's happening. And an object moving at relativistic speeds can't be updated as quickly, and thus it experiences time dilation.
This says that time does not really exist, time is simply the measured rate of change of things.
This also explains the upper limit for speed, which is the speed of light: pulses from one node to another can only have a certain fixed speed, thus creating a maximum speed of propagation of state between the points.
> I have no idea how this would explain the "spooky action at a distance" that QM does.
QM is wrong; spooky action at a distance does not happen. The particles have mechanisms inside them which alter their properties proportionally if they are 'entangled', i.e. created from a common source. That's why there is a measured correlation that breaks Bell's inequalities.
If you try to create a program that simulates entanglement, you will realize that the modification of the particle happens while the particle is in flight.
Not that I'm throwing in my hat with Wolfram's hypotheses, but Einstein's relativity, particularly general relativity, is a bag of problems in and of itself that requires a lot of... massaging... to get it to describe reality as we believe it is. So if compatibility with general relativity is the hang-up, why bother trying to make it compatible? GR isn't etched in stone. Like all mathematical hypotheses, it is a logical approximation. By setting it aside entirely, another hypothesis may (or may not) arrive at a better set of ideas.
We know that GR is missing some pieces, and we hope that someday it will be set aside in favor of something else, but that only happens when a new theory can explain *everything* that GR explains plus new things that it can't explain. I think Sabine was saying that Wolfram's previous stuff (the one where the graph lines have lengths) can't explain things that GR does explain, therefore it's not a candidate to supersede it.
I can't see how it's reasonable to describe General Relativity a "bag of problems" that "requires a lot of massaging to get it to describe reality as we believe it is." That just sounds like bunk.
GR is THE best model we have at cosmological scales, and thus far the ONLY model we know of which describes gravity accurately and in agreement with experimental and theoretical physics.
It would not be wise to discard it as a "mathematical hypothesis that can be set aside entirely."
@chriscurry2496 What is "space-time"? Aside from a mathematical coordinate system, that is. If light is traveling as a wave, in what medium is that wave propagating? I'm not asking for formulae or specific values. What is it? Just another name for 'aether'? Or is there a fundamental physical 'thing' we call 'space-time'? I'm not attacking GR. Just because we have a formula that approximates reality doesn't mean it's an actual description of it. One of the chief problems with "science" as respective fields of study in modern academia is that it creates fan-boys who are emotionally incapable of either accepting the shortcomings of their fields or tolerating the possibility of falsification, even though 'falsifiability' is a fundamental requisite of "science". If an alternative hypothesis doesn't instantly disprove EVERY aspect of something, it is immediately cast aside and ridiculed. GR, the Standard Model, evolution by natural selection - these things didn't reach their current potential in a vacuum, or instantly, or without DECADES upon DECADES of further study, hypothesizing, calculation, and observation done by countless people. 'Theories' don't spring up fully formed. In many cases, GR returns a divide-by-zero error. Every 3rd grader knows that you can't do that. So we've had to come up with ways to cope with various kinds of anomalies. Setting GR aside for the sake of pursuing another potential hypothesis isn't an attack on GR. It's admitting we don't know everything and that a new direction may yield new information.
@@leandercarey
"What is space-time? Aside from a mathematical coordinate system that is."
I'm going to tell you the right answer, but you're probably not going to like it or accept it. And if you can't, then we're pretty much at an impasse. In fact, you and modern science are at an impasse ...
So you've repeatedly asked what this space-time "is" while explicitly rejecting mathematical descriptions. This reveals, if I'm not mistaken, a deep bias which will probably make reality, outside of some simple stories about it, elude you. For what kind of answer could possibly be acceptable to you if you reject the one which describes it with real-number precision, with geometrical precision? Like, someone could describe it with an analogy - say, I could tell you that space-time is like a spongecake, and it warps and bends, and its curvature is what we call "gravity." But that's not REALLY how it is. How it REALLY is, is exactly how the mathematics REALLY describes it - that's IT. That's as precise and as real and as fully fleshed out, in all its predictive power, as you can ever get about ANYTHING.
If you reject that, asking for what it REALLY is (but not with some functions or linear operators--puh-lease!) is like asking what a car really is without the engine and the frame and the windshield, and etc etc etc.
And because you're apparently not so fond of math, let me defend the singularities inherent in General Relativity ...
You see, it's not that there are "divide by zero" issues. There's a subject that EVERYONE should take, or be taking, called Calculus, which shows you how to handle that with another "evil" mathematical concept called "limits." You can apply these limits to differential equations and find that you really don't have to divide by zero ANYWHERE within Einstein's beautiful partial differential equations (despite their truly evil non-linear nature).
And what you find then is that the singularities are a thing of beauty in and of themselves. They show you where the theory breaks down, and needs more. In NO WAY does that justify "replacing" relativity (you went on about "fan boys" or some noise and I didn't really care to follow that closely). In fact, it's a strikingly rare case where Einstein's theory actually predicts spots where one should look to advance the theory, because it's insufficient (ONLY in those spots, you see?).
Even if we WANTED to replace GR, which we don't, then nothing even comes close to doing what GR did, does, and will be doing in the future for a long, long, long time. It is very likely that General Relativity will NEVER be replaced, even if and when we find solutions to singularities which require enhancing the theory, or enhancing Quantum Mechanics or something like that. It's just that good and useful.
@@leandercarey
--- One of the chief problems with "science" as respective fields of study in modern academia is that it creates fan-boys who are emotionally incapable of accepting either shortcomings of their fields or tolerate the possibility of falsification, however 'falsifiability' is a fundamental requisite of "science" ---
So I wonder where statements like this even come from. Where are you getting your information about how "science" works in "modern academia"? To be honest, this sounds like some kind of parody, or nonsense generated by social media, where you don't need any sort of standards for telling true stories.
--- If an alternative hypothesis doesn't instantly disprove EVERY aspect of something it is immediately cast aside and ridiculed. ---
HUH? In what world does THIS happen?
It honestly sounds like you've been huffing that Bret Weinstein bullshit. Everybody should seriously stay the hell away from that. It just leads to nothing but a bitter headache and hating everyone, apparently.
-- Setting GR aside for the sake of pursuing another potential hypothesis isn't an attack on GR. It's admitting we don't know everything and a new direction may yield new information.---
Again, dude, I honestly don't know what you're even suggesting here. You can "set GR aside" all you want. Have at it. I'm just telling you that at the end of the day, ignoring it isn't going to help your theory none.
You listened and made the video! Thank you!
Obviously, I don't understand Wolfram's idea at all, but my intuition is that we can always only think about the universe in terms that are familiar to us. First it was gods on Mt Olympus who were responsible for everything, then we had the concept of the clockwork universe once we perfected clocks. Today people again try to cram the universe into boxes that they are familiar with, be they probabilistic, multi-dimensional or computer code.
From the sound of it, Wolfram's idea seems to be promising. An algorithmic universe has a certain appeal. But that doesn't necessarily mean that it leads to a simulation. Maybe a computer simulation is just the closest concept to reality that we can think of at the moment. Maybe the universe approximates an algorithmic universe, just as it approximates a Newtonian, mechanical clockwork universe, only slightly better.
Yes!
I'm not sure Wolfram is making a literal ontological claim that the universe is a hardware computer running a simulation like in the Matrix movies. Only that it functions in a similar way? Difficult.
Fascinating too.
@@thejackanapes5866 The problem is that people will take a model's metaphor, misunderstand it, and attempt to apply it literally. QM and "many worlds", for example.
Hi, I am Shon Mardani and this is my Unifying Theory of Everything:
[GOD] Created NOTHING, a Void and Empty Point in Space.
NOTHING Attracts Surrounding Space, this attraction or Pull is the Only Law of Nature and it gives NOTHING its Property to be a Particle which I call the GRAVITATIONAL PARTICLE (GP).
When [neighboring] Space moves into the GP, it Creates its own GP at that Vacated Space which Attracts its own neighboring Space. The high speed of the moving GPs will result in duplicating and creating new GPs which propagate and grow like a Crystal. This growth is in 3 Axes (x, y, z) and 2 Directions (- , +) on each Axis, therefore the numbers 2, 3 and their sum 5 are the Magic Numbers of Nature.
The Propagation or Growth of the GPs Create Closed Cyclic Patterns/Paths/Traces as Locked Loops. These Cycles Create the Atoms/Matter/Existence starting with Hydrogen. Hydrogen Atoms Collect to Form Nitrogen and Oxygen Atoms of the Atmosphere at the ratio of 4 Nitrogen to 1 Oxygen which I call ATMOSPHERIC UNIT (AU), then Gravity rearranges the GPs within the AUs and the Carbon Atoms are made.
Atoms Connect with each other by Overlapping/Shared GPs (single, double bonds ...) to Create Molecules.
Gravitational Particles move/travel within the Atoms and between the Atoms (intra- and inter-atomic), which I call GRAVITY CURRENT, and it moves in a Circular Path toward its Center of Gravity. On the Earth the Gravity Current is perpendicular to its surface and is moving toward the Center of Gravity of the Earth and its Connected Atmosphere. The Gravity Current on the Earth's Surface is felt as a Push from above and a Pull from below [Earth's Center of Gravity]; it feels like being under a shower from a firehose.
Atoms/Matter, because of their GP Loops, are the Physical Resistance to the Gravitational Current. Resistance to GP Current by Atoms manifests itself and is measured as Weight/Mass, Electricity, Magnetism, Heat/Temperature, Light/Color and all other Physical Properties of the Atoms, and they continually convert to each other. GP Current moves from one Atom to its neighboring and connected Atom; this means Conductivity is required for Gravity Current, all its Manifestations, and the fundamental Existence of the Atoms.
Gravity Current and all its Manifestations are Measurable which means we can Count the Quantity of its Constituent Units like GPs and the Cycles and Loops of the Atoms which make the Gravity Quantum.
Since the Atoms are collections of GPs locked in Loops/Cycles, the Quantity of the GPs in an Atom determines its Shape (Atomic Number), and the 3D Positions of the GPs determine the Orientation of the Atom, or its TIME. Gravity Current continually rearranges the GPs in the AU Atoms and changes the Atom's Orientation, which results in Embedding or Coding the Time in the Atom, like Stamping the Current Time in Atoms.
The Interactions between the Fundamental Organic Atoms (Hydrogen, Nitrogen, Oxygen and Carbon) with Different Embedded Times (timestamps) Create LIFE. Breathing, Drinking and Eating the Organic Atoms and Molecules with Newer Embedded Times provides Energy to Sustain the Life and the Energy to Move; it also provides Energy to Synthesize Heavier Organic Atoms like Na, Mg, P, S, K and Ca by Living Organisms.
To understand the concept of GP, imagine there is a room full of Marble balls with no friction between them; now remove One marble from the middle of the room. The empty space created by the removed marble attracts the neighboring marbles to fill in the empty space; however, only one of the 6 possible adjacent marbles can move in and fill the empty space: the one on the left, right, front, back, above or below. One of the marbles moves into the empty space and by that creates its own empty space and attracts its own neighboring marbles.
The Paths/Traces of the Moving/Propagating/Growing Empty Spaces are the GPs. We do not need the existence of any matter in the space to move into the GP; it is not the Matter which Moves, it is a GP which is Created in that Space and moves to the other GP. It is like the positive side of a battery, which does not exist without the negative side of the same battery: if you create the negative side, the positive side is created and will consume the negative side unless it is connected to the other batteries.
I have developed this Theory of Everything by studying the latest knowledge in physics, chemistry and biology, especially the Periodic Table, over the last many years, and I have done my best to validate it and find any conflict with observed and experimental fact. I appreciate your feedback to help me validate and correct any errors there may be before I get into details, thank you.
Well, hypergraphs are just good objects in general for modelling a problem domain. It's an area of research with a lot more applicability than we give it credit for: it applies not just to physics, but to computer science, chemistry, etc. Hypergraphs are a good language and offer a generalizable structure for reinterpreting existing problems and solutions, as well as for finding new and interesting ways forward.
I appreciate your well-reasoned skepticism. It is so rare these days. Please keep commentary like this coming. I feel that it is a great service to the reasoning public.
When I first watched a Wolfram lecture it triggered me to take a core tenet of what he was saying and extrapolate it into a basic truism: All occurrences in the universe, from origin to today, from quanta to galactic clusters, can be seen as a series of interactive algorithms which build on each other to create an algorithmic universe.
When any two particles join they have created an algorithm that will now react to inputs from their environment in specific (now patterned) ways. As particles continue to join together in higher complexity, they form more specialized algorithms: nuclei and electrons, atoms/elements, molecules, compounds, geology, life, biospheres, planets, stars, solar systems, galaxies, the universe. All is a Structural Natural Adaptive Networked Algorithm. All things remain interactive with each other in grand interconnected algorithms (for example living forest soils). The universe is a grand interconnected adaptive algorithm.
Some Natural Adaptive Networked Algorithms become relatively stable. Examples: The Sun. A black hole. A rock. A marble. There is a spectrum of stability vs complexity between a rock and weather. In high gravity systems like the Sun or a black hole, complex algorithmic interactions are crushed and incorporated into a relatively simpler algorithmic system. (However this is reversed in stars that go nova or supernova, thereby restarting a new cycle of building to complex algorithmic interactions). In solids like rocks and marbles algorithms get locked into a very slow moving and outwardly interacting matrix.
ORIGINS OF LIFE
We can also look to the process of algorithms building and changing in the first biological life (and in the pre life just before it - molecular algorithms).
All interactions between living beings are likewise natural adaptive networked algorithm interactions, but with highly improvisational, almost stochastic complexity in possible outcomes (like weather).
Genes, chromosomes, genetics, are natural adaptive networked algorithms.
Genetics combined with Epigenetics form higher complexity Natural Adaptive Networked Algorithms. (This is the last nail in the coffin of Dawkins' Selfish Gene theory.)
Religion is also an adaptive networked algorithm (manipulated by human designed algorithms/memes).
And thereby we can riff into the algorithms of culture and even computers and computer programs themselves, coming full circle back to where Wolfram started us off.
Psychedelics and cellular automata apparently lead to the same conclusions :)
Very interesting, and I agree, but I think the breakthrough comes when it's discovered WHY this happens and not only WHAT is happening. But the WHAT contributes greatly to progress.
@@void________ True. But the question of **why** essentially gets us to the fundamental question of what existed and happened before the 'Big Bang', which is as difficult to parse as the question of unifying relativistic and quantum physics. And of course both questions likely have the same answer. The only conception I've been able to imagine so far is that before the 'Big Bang' the only actual physical thing in the universe was gravity itself. By 'gravity' I mean a uniform self-attracted substance (what could be colloquially described as a gummy magnetic 'plasma') rather than just a 'force', which was at first in equilibrium but then fell out of equilibrium, and in doing so began the process of the 'Big Bang'.
Here's the concept:
Picture, for the sake of argument, at the 'beginning' of the universe a uniform ball of this 'gravity' in a stable equilibrium, which either inherently could not remain stable or was disturbed in some way.
Parts of that 'gravity' plasma then asymmetrically and differentially separated from other parts and packed down tightly into sub-particles that were internally convoluted in varied ways, which made them differ from other packed-down gravity 'sub-particles'.
This separating and packing down was then followed by periodic, uneven-pressure-triggered explosions back outward, followed by even more complex separation and packing down, with the entire new collection of **everything** crunching down together in on itself and then exploding back out (possibly many times in succession). This process could eventually create all of the properties that we now see in all matter/energy (including repulsion) - and in turn the universe as we know it.
I'm a lay person and don't have the sophisticated mathematical knowledge and skills to put this credibly on paper, but I'm hoping that eventually AI might help me do so.
Regardless, any of you who are trained physicists, who read this want to take the idea and run with it, be my guest.
The answer is 42
15.
indeed it is
Never thought much of the movies, and that joke is so lame. Perhaps the books were a lot better. Maybe the joke is just too good for me to comprehend...
@@deathorb To say the movies didn't do the original work justice would be the understatement of the century. The reason all of the Hitchhiker's Guide quotes are so famous is that the books were one of a kind, unmatched ever since in more ways than I have time to explain in a comment. If you're judging the series only from having watched the old movies, it is understandable why you don't understand their significance.
But what was the question again?
The most interesting aspect of Wolfram's work is the idea that the complexity of our universe is emergent from basic patterns.
That in itself is worthy of much more investigation.
This came out in Chaos Theory.
Thanks for the video. But, as usual....meh. I'll go with Einstein.
One of the many things I love about Sabine is her humor. In the midst of trying to explain all this she throws in, at 7:10, "while trying not to go insane". It completely disarms you for a moment, and then it's back to the discussion. It allows you to take a mental breath and then return to the subject. Always timely. It allows me, not being a physicist, to understand more of the physics without giving up. Thanks Sabine
I like this because I like all science, and I now try my best, when it comes to science, to be as open-minded as I can and not to be biased. The universe is what it is. While it is perceived in different ways, it can certainly be explained in different ways, each way simply being a language, different from the next.
There's this video of Jonathan Gorard giving a presentation to a bunch of physicists from a few months ago. I found his talk very informative and not too difficult to follow, although it could have been a bit more organized. But the other participants in the call really showed a fundamental confusion that must have set in right on slide 1, given the kind of questions he got. I found this very illuminating and kind of funny. I mean, graphs are not exactly rocket science, but they were so stuck in their old ways of describing things that it would have taken quite a while to get them to understand even the (rather simple) fundamental ideas. Poor Jonathan stayed extremely polite (he's British after all), but he must have felt like someone explaining the way to the airport for 15 minutes and then suddenly realizing that his counterpart didn't speak English.
3:48 Yippie! feels so good to finally be right in Sabine’s video
Patterns are often descriptive, not necessarily predictive. Thank you Sabine. 🌹
0:13 I felt that realization in my bones. Nobody really completely knows what they're doing. Might as well try
I do, and if you buy my course, I will teach you how
I do not understand a lot of what you're talking about, but I love to listen to you talk, and I'm learning things. It's a win-win.
0:56 just add organoids.
I've been thinking about something akin to this approach for many years. A friend of mine (PhD logic and philosophy, but his undergrad was physics at Caltech - my undergrad was also physics, Harvard) and I thought that one way to approach the preferred basis problem from an Everettian point of view was to consider the idea that the reason for the preferred basis is that it is the only basis in which one can form the sorts of causal relationships (i.e., somewhat closed feedback loops in induced spacetime) that you need for perception and cognition (i.e., life, as we know it). Even the simplest organisms require this sort of causal feedback, and we know that the preferred basis is the only basis which has local spacetime. If you think of the universe in causal set terms, one might then conclude that there might be a way to explain why spacetime appears to have the structure it does (Minkowski-like manifolds): that structure becomes an optimal way to obtain the apparent causal relations when constrained in this way (i.e., by the requirement that one has to be able to have perceptual/cognitive feedback loops of some kind). In order to work this out one would naturally start with SOME notion like causal sets - causal relations but not *distance* relations (distance would be an emergent phenomenon, not something built into the network/graph).
What does it matter what the observer sees? The object's size or energy doesn't change, just the observer's perception of it. Relativity should mean that observations depend on the observer, but they don't affect the object being observed, just the perception of it. It's kind of like rumors and facts: rumors may be derived from facts, warped by the people observing them, but they don't change the facts.
Ok, but then how do we determine what standing perfectly still is, so that we can have a baseline energy for some wavelength?
And why is it that when I drive a car going 20 mph towards another car, going 20 mph towards me, it looks like it's going 40 mph and I feel as though I'm standing still, but no matter what speedy vehicle I attach my flashlight to, I can never get the light to come out faster than C? My light sensors don't see the speed of light from the headlights coming towards me as C+20 mph.
The observer doesn't change the universe, but nothing in the universe has a "true" size. And this is all totally separate from the expanding universe stuff. All we know is no information seems to be able to travel faster than the speed of light. The reason photons can travel that speed is they have no mass. More mass = more energy to accelerate something.
That's not what relativity is saying; the term 'relative' in everyday language is very different from relativity in physics.
That's not correct. In the frame of reference of the observer, it is reality. There is no preferred frame; that's why it's called relativity.
@@Thomas-gk42This to me says that all observers are equally delusional. I suspect there is a reality that cannot be observed, but physics does not want to deal with this, because it is pretty much impossible if you adhere to the scientific method. Amazingly, what is observable has increased over time, with better sensors and so forth, so there may be a lot of things we have never even imagined, much less observed. Science is not the only path to truth.
@@zunuf I get that C is the max speed, just like the Planck scale is the min (length, time, whatever). But why would you need a baseline? It all depends on you and the object you observe. If it's light, it already moves at C, so the only thing that can change is the color. If it's a particle, and both it and you move towards a point of head-on collision, and the sum of your speed and the particle's is C or greater, you would "see" it moving at C, because information's max speed is C. But that does not change your speed or the particle's, does it?
I feel like you're describing imperative programming. But a knowledge graph is a logical structure which isn't tied to doing things step by step.
Glad to see this work is getting the open-minded attention it deserves.
The way I see the work, he’s more or less exploring the consequences of a specific, but simple mathematical axiomatic system - the computation of updating rules on a hypergraph.
The thing about it is that it is capable of expressing any discrete computation. If the universe has discrete properties, then by definition his system can compute it if you find the right rules and initial conditions. The main challenges are finding those rules, and identifying when you have done so.
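For concreteness, here is a minimal sketch of what "updating rules on a hypergraph" can look like (a toy of my own, not the project's actual code): the hypergraph is a list of hyperedges, and the example rule {{x, y}} -> {{x, y}, {y, z}} grows a fresh node off every binary edge.

```python
# One updating step on a toy hypergraph. Hyperedges are tuples of node
# ids; this example rule matches binary edges only, keeps each matched
# edge, and attaches a newly created node z to its second element.
def step(edges, next_id):
    out = []
    for edge in edges:
        out.append(edge)                 # keep the matched edge
        if len(edge) == 2:               # the rule only matches binary edges
            x, y = edge
            out.append((y, next_id))     # attach a fresh node z to y
            next_id += 1
    return out, next_id

edges, fresh = [(0, 1)], 2
for _ in range(4):
    edges, fresh = step(edges, fresh)
print(len(edges), "hyperedges after 4 steps")  # 16: geometric growth
```

Even this trivial rule generates unbounded structure from a single seed edge; the hard part, as noted above, is identifying rules whose emergent structure looks like our physics.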
So it’s a little bit like someone taking ZFC and saying it’s a “new kind of physics” because by developing enough theories you can find some with symmetries that mirror the symmetries of physics. In the end, his mathematical system is probably equivalent to just about any other mathematical system, so the main benefit would be if it is somehow easier to compute or evaluate physics-like theories.
One neat thing about his work is that QM and SR pop out fairly early on as a consequence of making the system more general and removing assumptions. Another is that it’s fairly straightforward to generate and simulate any ruleset.
One area that’s most concerning to me, as far as its promise for physics, is rotational symmetry. I don’t understand how to get rotation invariance from a spacetime composed of a graph. His explanation seems to be that rotational symmetry is a limiting approximation, but as you point out, this should still lead to detectable quantum effects.
Maybe the work resolving Lorentz invariance that you referenced also addresses this?
"Wolfram is a curious case because he's not your average crank." Indeed, a legend in his own mind.
🤣🤣love that comment !
he is pretty smart
@@nickcaruso, unfortunately, intelligence doesn't necessarily correlate with WISDOM. 🧠
@@ReverendDr.Thomasfortunately, wisdom doesn't help with physics
"A legend in his own mind" is a description of the average crank
This actually makes a lot of sense. The issue with progressively completing a row of data in the illustration is that it subtly relies on the intuition that everything is constrained by time as some sort of privileged dimension - you expect to see the information propagate over time, and you assume that the row in front of you is a single dimension with no information contained across the Z axis or beyond. But ultimately any Turing-complete process can take any amount of logic, logic which may 'manifest' as multidimensional models, and represent it in a single dimension. And what happens in a Turing-complete program? Nothing all that special - just a massive tangled web of memory pointers. And if you 'decompressed' and stretched a program out into a shape so that you could visualise the states of those pointers, you would have your graph. Our arrow of time isn't necessarily shared by the fundamental logic of reality; it just becomes an emergent dimension, and there's no resulting need to be able to observe the propagation of information across this graph, because it's the other way around. Our experience of time is completely divorced from the relationship between these pointers.
Now, my issue with everything is this: I'm not convinced that this is really a 'theory of everything'. At best, it's a representation of the lowest level fundamentals of reality, however if we make the assumption that anything which occurs is governed by some form of calculable logic, then every physical process is already theoretically Turing complete... and all this theory does is articulate it in a mathematically coherent way. But it doesn't really reveal anything about what the properties of the universe *are*, or bring us closer to understanding the reality etched into this fundamental fabric. It just shifts how we represent information in a wildly complex system, without giving us the tools to make new inferences. It doesn't give us a new model, it's an unfalsifiable conjecture which will always remain possibly correct so long as everything in the universe has some basis in logic, which you can always argue is the case because it's real and it's real *somehow*.
Interesting. If we avoid calling it "time" and instead think of it as causal relationships, I would say computations still have this. Even the simple computation of a mathematical expression has inherent causal relationships, e.g. multiplication must be performed before addition, brackets and exponents before that, etc. There is a concept of before and after that, if not obeyed, changes the ultimate result of the computation.
And it's these causal relationships that are captured in the networks/hypergraphs.
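A small sketch of that point (toy code of my own): the expression (2 + 3) * (4 + 1) as a dependency graph. The two additions may run in either order, but both must precede the multiplication, and every ordering that respects the causal edges gives the same answer.

```python
# A computation as a DAG of causal dependencies: any topological order
# of the nodes yields the same final result.
from graphlib import TopologicalSorter

ops = {
    "sum1": (lambda env: 2 + 3, []),                              # no dependencies
    "sum2": (lambda env: 4 + 1, []),                              # no dependencies
    "prod": (lambda env: env["sum1"] * env["sum2"], ["sum1", "sum2"]),
}

deps = {name: set(needs) for name, (_, needs) in ops.items()}
env = {}
for name in TopologicalSorter(deps).static_order():
    env[name] = ops[name][0](env)   # every valid causal order gives the same env
print(env["prod"])  # 25
```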
And I think Sabine would agree with you that there is no rule that says the universe must have one single theory of everything. There could be several discrete sets of rules that don't overlap.
However, if we look at the observation that nature seems to prefer the lowest energy states, then an Occam's Razor type approach can be attractive. If we had to choose between a single fundamental structure to reality from which everything else emerges naturally, OR several separate fundamental systems that somehow interact but don't contradict each other, which one seems more likely?
Now, unlikely things happen all the time; there is no rule that says it's only ever the most likely outcome that occurs. BUT the fact that it's the simplest explanation does have a draw to it.
@@aggies11 The order of operations in mathematics is just a convention tied to the syntax we use to represent expressions, but they're not inherent. You can devise an equally valid way of writing things down in such a way that multiplications follow addition, but to retain equality you'd have to rewrite it all. Additionally a causal relationship, at its lowest level, is a reversible operation, so there's no inherent direction to the flow of information across a graph. If I were to draw out a graph with three states: 4---5---6, you might see it as an upward progression, but that's not a property of the graph, that's just a result of reading it left to right and inferring that we're adding one each time. Someone who speaks Arabic or Hebrew might look at it and infer the opposite. Both are right. That's what I meant by the arrow of time as we know it being a distinct phenomenon to the flow of information as dictated by the absolute lowest level of reality, however there would still be 'an' arrow of time across which a lot of the logic unfolds independently of time as we experience it unfolds. I didn't mean to suggest that Wolfram's framework for devising models of reality could coexist alongside others, as that would contradict the nature of his entire theory.
@@minikame2272 ah, ok, then we were in essence saying the same thing. For me it wasn't the direction of the "4 5 6" in your example, but rather that there is a causal relationship between them (ie. 5 must always come between 4 and 6, we can't start or end with it).
Hey Sabine, I love your videos! You are probably aware of the equivalence between different well-known models of computation (Turing machines, Cellular Automata etc.) that require a global or synchronized clock. There is a less-known approach to computation based on interaction nets (or interaction combinators) that model computation as rewriting a graph structure which is similar to rewriting expressions in lambda calculus. However, interaction combinators allow asynchronous concurrency, meaning that the graph rewrite rules can be applied on different parts of the graph and the order does not matter since the computation and final structure stays the same -- similar to how in relativity different observers may not agree on the order/simultaneity of events but the physics stays the same. I hope their research is successful or creates useful technology.
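A toy way to see that order-independence (far simpler than interaction combinators, but the same confluence idea): a local rewrite rule applied at randomly chosen sites always reaches the same normal form, however the applications are ordered.

```python
# The local rule "ba" -> "ab" can fire at any matching site, in any
# order; the system is confluent, so every order of applications ends
# at the same normal form -- like observers disagreeing on event order
# while agreeing on the final physics.
import random

def normalize(s):
    while True:
        sites = [i for i in range(len(s) - 1) if s[i:i + 2] == "ba"]
        if not sites:
            return s                      # no rule applies: normal form
        i = random.choice(sites)          # fire the rule at a random site
        s = s[:i] + "ab" + s[i + 2:]

print({normalize("bababba") for _ in range(20)})  # always {'aaabbbb'}
```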
Yes! Thank you, Sabine, the energy carried by photons _is_ observer-dependent! I picked up on this after seeing an interview with the indomitable Sir Roger Penrose (a man who must surely rate as one of, if not _the,_ greatest living physicist), but, like so many other things in physics for the avid layperson, it's always good to have an idea you've picked up on made more concrete by another practising physicist. And with her little digression about photons Sabine did just that for me, and literally only days since I started to contemplate it... satisfaction!
I was stunned when I heard Sir Roger mention that in the very early universe when energies were enormous and high-frequency photons were blazing around everywhere, there was a problem with this picture. The problem was that if you could zoom along next to one of these photons, then you would just see a regular, low-energy photon, and not one of the super-energetic ones that were supposedly carrying much of the universe's energy with them!
This wasn't what concerned me, however; what I was thinking was that you, me, and every other living thing on this planet owe our existence to the low entropy of the Sun and, more specifically, to the high-frequency, energetic photons that we receive here on Earth. Plants convert some of that energy through photosynthesis, and through that conversion they grow in size. We then eat those plants, or else we eat the animals that eat those plants, and our cells use the energy stored in our food to create ATP molecules. We then use those ATP molecules to create an energy gradient - to keep ourselves far from equilibrium - and this is how we all struggle on; somehow, however unlikely or difficult it may be to keep fighting the relentless march of entropy, we do it. You, me, and every other living thing on this planet owe our existence to the fact that the Sun is a bright, hot object in a cold, dark sky. This is what keeps us going; this is the _only thing that keeps us alive!_ It's the _difference_ that counts. It's the energy gradient. That's what gives the Sun a lower entropy than the space around it, and we then use some of that low entropy to keep ourselves as far from equilibrium as we can, because equilibrium is high entropy, and high entropy is death.
Of course, we don't actually _gain_ energy from the sun, because if we retained any energy that the Sun gives us each day, then the Earth would become uninhabitable in fairly short order. And so all the energy that the Sun provides during the day is emitted back into space at night - for every high-frequency photon that enters our atmosphere, 16 low-frequency photons are radiated back into the dark depths of space.
And _this_ is what I was thinking about: if we owe our very existence to being able to harness the Sun's high-frequency photons to create an energy differential, but those photons only have that concentrated energy - the energy that we _need_ in order to stay alive - because of how we _see_ those photons, not because of how they inherently _are,_ then, well... isn't that just a bit whack?!
Your heart is beating right now purely because of the way we look at photons arriving here from the Sun. Nothing more, and nothing less. Anyway, _I_ thought it was whack. In fact, I thought it was absolutely bat-shit crazy! But then discovering that we are all being constantly accelerated _up_ into space at 9.8 m/s² by the internal pressure of the Earth, and that this effect is what we mistake for gravity near the surface of the Earth, wasn't the easiest concept to get my head around either!
Physics is weird, the _world_ is weird, living in the world is _weird,_ and really, the whole damn show is _weird._ I mean it's very weird... we are alive because of nothing more than how we look at the sun... it's bloody-well bat-shit crazy!
Anyway, have a great, sunny, day!
I too, praise the sun. Nice read, thanks.
" energy carried by photons is observer-dependent!"
Not really; the energy of the photon is fixed. It's just that the energy released depends on the relative velocity between the photon and the object. The variable is the object's speed in relation to the photon.
Sabine's example about the photon is a poor one, as it does not reflect Wolfram's model.
Saw Sabine and Roger live together on stage at the HTLGI festival in London last month. The two most honorable scientists humanity can count on currently. Great event.
"but then discovering that we are all being constantly accelerated up into space...and that this effect is what we mistake for gravity.." It's almost like you're simply wrong about how a lot of this works, but through confidence bias you're able to tell yourself otherwise. You wrote this either hyped up on meds, or in need of them.
@@Vaeldarg Well, you're half right at least; I mean, I _was_ hyped-up after seeing the video, and, in my defence, it _was_ about 4 am when I wrote it, so, you know... a little overtired, perhaps.
However, as to your primary objection here, namely, that I was factually wrong about a physical process, well, I'm sorry, but on this point, it's you who appears to be wearing the scrambled eggs. And quite a lot of them too, I might add!
I can say this with the utmost confidence because a.) it has been a very well-known and understood physical phenomenon ever since Einstein's general theory of relativity was first published - all bodies in contact with the surface of the Earth _are_ being accelerated _outward_ at 9.8 m/s²; b.) there are a plethora of presentations, videos, and lectures out there on the interweb (ScienceClic's presentation on general relativity is particularly helpful) that support this fact, including one that Sabine herself released recently; and c.) you don't have to take my word for it - you can determine this for yourself in just a few minutes. Here's how:
• Climb up on your roof and attach some bathroom scales to your feet with gaffa tape. Look down and observe your weight. Now jump off the roof and you shall see that you weigh... nothing whatsoever. This might seem strange if you are being pulled down by the "force" of gravity - for if you weigh nothing at all, then what mass is gravity acting on?
• Take any two objects you like and (accounting for air resistance) observe how they _always_ seem to fall at _exactly_ the same rate. This was an effect that was never satisfactorily explained within Newton's account of gravity, but when you realise that when any two objects are let go they _do not move anywhere in space,_ and that it is the surface of the Earth itself that rushes up to meet them at 9.8 m/s², then it becomes immediately apparent why this effect occurs. Because _of course_ any two objects will _appear_ to fall at the same rate _if they aren't actually moving at all!_
• Ok, I can sense that you still aren't fully convinced yet, maybe, but this one's a clincher! Your mobile phone has an accelerometer built into it, so simply download an accelerometer app and look at the reading! It will tell you, and I say this with all confidence, that you are currently being _accelerated_ at the rate of 9.8 m/s² (give or take a hundredth of a decimal place or two)! And you can't be going _down,_ can you? Then drop your phone from a small height onto your bed and look at the _lowest_ reading; it will show 0.0 m/s² while it "falls", and your phone, as you saw it "fall", went absolutely _nowhere!_
And so... voilà! Now you know!
Look, in all seriousness, I don't blame you for doubting the bona fides of some nutcase in the YT comments section, I mean, there _are_ just a few of that sort around! But I don't make statements here unless I'm absolutely sure about what I'm saying, and I can tell you now that I'd put my very life on this, altogether incredible fact being, in fact, altogether credible! It _does_ require a small slap to the face from our old friend cognitive dissonance, and a little bit of quiet reflection, before it all sinks in, though. I must have spent about four months researching the issue before I came to finally believe that what I had heard was actually, really, factually, true. That said though, I didn't think of using my phone to test it back then, so... and I must have spent about nine months trying to convince my partner that this was her reality. She eventually said that she "got it", but I have my doubts about whether she ever truly did! I admit, it's nuts, it's totally nuts, but when _you_ get it, you'll never look at the world in the same way again!
Post Script: there _is_ an effect of gravity near the surface of the Earth, but it's minute. It would take, for example, around three hours for a dropped pencil to reach the ground if only the "force" of gravity were acting on it.
Afterword: if you think about it, gravity _can't_ be holding us down. The ISS orbits at around 400 kilometres above the Earth, and gravity at that height is still about 90% of what it is here on Earth's surface. So, if it _was_ gravity that was acting on us, then how could the astronauts just float there... almost as if they _weren't moving anywhere in space?_
Physics is a trip! Have a great, accelerated, day!
2:16 This is true for Turing-complete systems like classical and quantum computers, but what about systems capable of hypercomputation, like real-number computers? That would (in theory) allow for computation with infinite precision, removing the need for discretization. Hypercomputation doesn't seem to be physically possible in our universe, but if we're talking about the simulation hypothesis anyway, then what if it's possible in the physics of the simulator's universe?
We've had the concept of infinite precision with calculus for about 400 years. Binary arithmetic and graphing, to my knowledge, do not yield infinite precision; analog computing and its numerical methods would come closer.
@@johnrieley1404 Something struck me as very wrong about your claims about calculus, and I think it is this: calculus is not designed to produce infinite precision. It's designed to produce functions that allow arbitrary precision. No matter what function you end up with, it still must be supplied by the user with infinitely precise real numbers - which of course are practically impossible absent hypercomputation, as OP stated.
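A small illustration of "arbitrary, not infinite, precision" (my own toy, just to make the distinction concrete): summing the Taylor series for e with more terms and more digits gets as close as you like, but any halting computation stops at finite precision.

```python
# Approximate e = sum(1/n!) to a chosen number of terms and decimal
# digits; precision is arbitrary but always finite.
from decimal import Decimal, getcontext

def approx_e(terms, digits):
    getcontext().prec = digits
    total, fact = Decimal(0), Decimal(1)
    for n in range(terms):
        total += 1 / fact      # add the term 1/n!
        fact *= n + 1          # build (n+1)! for the next term
    return total

for terms in (5, 15, 40):
    print(terms, approx_e(terms, 50))  # converges toward e, never "reaches" it
```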
@@galladeguy123 I think what you’re really asking is whether the universe is truly continuous or not.
As someone who likes graph theory, I see this as an absolute win. As someone who had to look up the ideal gas law, i have no idea what you're talking about.
4:42 there's a way to fix this problem with computation.
In reality computation takes time, and that creates basically a 1:1 relationship between the speed at which information is computed and the size of objects.
Aka, the faster you go, the more cycles you take from the Universe's CPU, and the thinner you will appear to another observer. Space itself becomes distorted because you have different computations happening at different speeds. The length of something is tied to the time of computation, because that's how you measure how long something is: the amount of time light takes to go from one point to another, i.e. the number of cycles it takes - but the number of cycles remaining for the computation changes. Obviously light always goes at the maximum speed, the clock rate of the computer, but particles that are not mere simple information take time to be changed, so they will appear smaller.
"c" is mere the cycle clock of the Universe, so when you go fast, you basically consume all the cycles, and if you go at the speed of light "c", you basically stop moving and become so thin you disappear, that also solves the problem with singularities.
@@luizmonad777 FINALLY! Someone else has the same idea as I!
Yep, most people have no idea that CPUs have branch prediction and micro-op fusion, are superscalar, and have out-of-order execution.
I don't understand why Wolfram's approach is considered so non-standard that physicists ignore him. Isn't this the exact same thing string theorists have done for 30 years? Neat ideas, zero backing evidence. Making a theory first, going looking for evidence after?
I know it's ass backwards as science goes, but if string theorists can do it, why can't Wolfram?
Maybe because he is independent (socially, intellectually and financially) and therefore able to play his own game - not theirs. So he is an outsider by definition. But I think he would not be the first outsider to bring science forward (see T.S. Kuhn: The Structure of Scientific Revolutions).
You have to give the string theorists some credit. The approach was motivated by real physics.
String theory is not experimentally testable, but the principles are very much physically motivated.
@@maclypse lol, String Theory is much closer to physics than Wolfram's model (at least currently - and for the foreseeable future). Whenever I read of someone bashing String Theory, I know they just haven't really studied physics.
Physicists have relied on "theory first" work done by mathematicians for centuries. How many advancements in physics were facilitated by some abstract math, developed decades or centuries prior, wherein the physical is a special case? Wolfram's work sits somewhere between pure math and theoretical physics. He and his team are working to build a framework wherein the patterns of established physics emerge naturally but should also provide some additional predictions. They are still in early stages. AND, unlike string theory, he is funding it himself. He is not wasting everyone else's time with promises of experimental verification right around the corner or, conversely, never.
Wolfram. Great name. I've heard him talk, and in my opinion he's intoxicated by the exuberance of his own intellectual verbosity. He's a businessman and has to sell his ideas whether they are true or not.
Since the universe is so old, it presumably runs on a mixture of FORTRAN and COBOL with a bit of assembler for the singularities.
On a serious note, I'm concerned that computer scientists seem to be trying to fit reality into computing rather than vice versa. String Theory seems to have uncomfortable echoes of how hard drives work, for instance. The fact that we can make a discrete model of something does not mean the something is discrete; ask Ptolemaeus, Copernicus and then Kepler.
It's easier to accept that the universe is limited, rather than us.