I did Sabine's course. It is really nice, but it is short and covers just the beginnings; I finished it in one day. I wish they would add more advanced quantum topics in the same style.
Enjoyed the report. Sabine is the best at illuminating progress in physics. Like D-Wave, a company in San Diego named Mem Computing has been doing quantum simulations and has shown dramatically shorter compute times compared to conventional digital computers. I believe there is a standards body that develops standard tests so that these companies doing quantum simulations can compete; for example, there are tests for optimization and for finding primes. I am not an expert in this area but will try to find out more to add to this list of comments.
In the conclusions of the paper they claim: "We simulated square, cubic, diamond, and biclique topologies that are relevant to applications in materials science and AI, and are amenable to scaling analysis through the area law of entanglement and universal quantum critical scaling. More challenging and irregular topologies-for example those corresponding to deep neural networks". So if real-world applications exist and their results carry over, then we can test the quality of those results on the applications. Until that is done, we should stay in a quantum state between amazed and annoyed.
Hello Sabine, I do my best to follow your channel, but often get lost in the labyrinth. Imagine yourself as a teacher of a class of 7/8 year olds, and me at the back sticking my hand up and asking, "Ms Hossenfelder, Ms Hossenfelder, it is not how that concerns me the most, but it is WHY?"
I was making a molecule in Caligari trueSpace Legacy Edition, and it slowed down horribly at 200 metaballs. The project requires nanobots to go in there and create a nano-lithography machine, to make progressively smaller nano machines. Then you can hold a Tesla giga fab in the palm of your hand. Send them to Mars to produce 32oz diamond-cut acrylic tumblers, then keep stacking the cups to make a little shelter for more rare and interesting materials; color the glass so you are amazed by the artifacts that have been actively arranged and collated.
Wow, this is incredible! I look forward to the day where we all have such a machine in our homes that can compute a very, very, very, very specific thing and nothing else.
Please, explain time. Why do we say time passes slower or faster depending on your location and/or movement within the universe? Could it be that it's just the measurement of time that is different? Scenario: Calculate a future spot in Earth's movement through the solar system and pinpoint it. Now pick three spots within the universe to observe the Earth's movement until it reaches the calculated spot. Observation location 1 is the NIST laboratory in Boulder, Colorado. Observation location 2 is our moon. Observation location 3 is an orbit around the sun at near light speed. Now have all 3 observation locations start recording the passing of time at exactly the same time. All three observation locations would stop the time recording when the Earth landed on its previously calculated spot. Would it be illogical to say the physical amount of time passed for each observation location would be the same? Could it be that the recording device used to record the passing of time showed differently, but the physical amount of time would be the same for all 3 observation locations? Does it seem reasonable that what we need is a way to record time with a device that can account for its location and movement within the universe, so that the recording of time passing is equal on all accounts? The Earth will move through the solar system and reach its calculated spot in the future in its due time. It won't be early and it won't be late. It will be right on time according to the calculation. Why would we say the passing of time is different for the 3 observation locations?
I'm really curious if the phrase "There is no spoon" would have got Sabine into some sort of copyright thing or if she just didn't feel like taking the time to google the exact quote and decided to paraphrase. One of life's little mysteries, I guess. FWIW.
Quantum computing is like fusion power. Great in theory.
Nice! The Ising model was my master's project at university in the late '90s, with a 25x25x25 grid. It's cool to see the problem is still there 😅! Go Ising, crash the new-gen computers!
You must walk before you can run, but I have decided to become impressed by a quantum "thing" when it decrypts a currently unbreakable message within a minute or a second or so, or when that thing finds the provably best move at every position in a game of Go.
It is a kind of analog computer; such computers were common a long time ago, until they were superseded by digital computers. Analog machines were very effective for solving differential equations. The difficulty with this quantum machine lies in finding the proper analog problem to mimic the problem you are trying to solve. I would love to see them solve the Travelling Salesman Problem as a demonstration of a useful application.
'Simulation' kind of implies 'computation', but hair-splitting aside, I think that (specialized) computation via setting up an initial configuration of the system, and then letting that system settle into some kind of ground state, is a really interesting approach. I may be wrong (and I'm happy if I'm right 50% of the time, especially about things that I have only a glancing understanding of), but I think I can already see how such an approach could be used for extremely fast factorization of public keys (used in public-key cryptography). What one would need is a (dynamically resizable) 2-dimensional grid occupied by fermions (only fermions will do, because of Pauli's exclusion principle), with that grid occupying a vertical position in a gravitational field. Start with a grid size of NxM (N horizontal positions [along the x axis], and M vertical positions [along the y axis], where N
From the picture of D-Wave devices there are at least two of them. To check the accuracy of one you can run the same task on the other and compare the results.
It's kinda like analog computers. Geared towards very specific tasks and provide fuzzy results that are hard to verify, but sort of match the expectations.
It is like saying that a wind tunnel used to test airplane wing designs is a computer. It is just a physical system very similar to the targeted one, so we can extrapolate some results with good accuracy.
If something gives you answers to one or more questions, based on input data, it has been computed. A wind tunnel simply moves "air" in a specific direction, at a speed which may or may not be controllable. It is analogous to reading parameter data from a storage device or keyboard. The tunnel itself doesn't recognise the presence of a wing, and it doesn't react to its presence. That is different from the simulation within a quantum computer.
@@another3997 I meant the wind tunnel plus the particular wings, and the computation would be the lift, etc. The qubits just simulate quantum interactions, and the configurations need to change to emulate different systems.
All kinds of stuff takes my laptop millions of years.
it still cannot brew a decent cup of coffee.
ROFL
I heart you too, Sabine! ❤️
School laptops after trying to open Word: millions of years to come while my screen goes white and my user red
And surprisingly, there are actually algorithms that your conventional von Neumann laptop (or its vector-processing GPU) can run that we simply haven't found a way to perform with any quantum effect we know of. Most problems simply can't be done by a quantum computer at all yet. IBM is trying to push the envelope on a general-purpose quantum computer, but a quantum computer as adaptable as our current computers is still not even on the horizon...
This announcement is just that they figured out how to use a quantum effect to perform a certain type of calculation that would take conventional computers a very long time to do, and it mostly shows that we still don't have a general-purpose quantum computer.
TL;DR version:
D-Wave's quantum computer (which has been criticized by some in the scientific community for working more like a "quantum simulator", due to its low programmability compared to IBM's or Google's devices) solved magnetic systems at scales previously unsolvable, or extremely time-consuming (an entire epoch, in fact), for conventional computing. They did this to prove that their device works for quantum computation involving quantum simulation, and that this is a huge leap forward for their devices.
Nice reference to Neo's "There is no spoon"
D-Wave has a history of making such claims, that they have done a computation that would take conventional computers excessive time, some of which have subsequently been debunked by example. In their early days, Scott Aaronson was skeptical. I hope he takes notice of their latest and weighs in.
Yes, that's right. Then again, you know, I don't think they're doing this on purpose. And I guess we'll see about this one...
Do you mean that they are bending the spoon??
Hey 2 minutes ago woot lol @@SabineHossenfelder
The funny thing about Scott and the other theorists who criticized D-Wave is that they all got on the NISQ hype bandwagon when the field became lush with money. D-Wave pioneered the model of quantum overhype, and then Google, IBM, etc. picked up the torch around 2018. "Oh, but these are gate-model machines, at least they can run Shor's algorithm one day!" (assuming a decade's worth of developments that are not working yet). They criticized D-Wave's baby steps and then turned around to expect applause for Google's baby steps.
They can not make any claim unless they have a mouthpiece in their science gang.
How do the magnets and qubits actually do the quantum computing?
The spoon calculating its electron configuration as a “one trick” was hilarious and genius. Loved this video 🫠
Then we make a series of magnets levitating a spoon as a quantum simulator. 😆
It's actually the daily work of chemists. If they want the data for a molecule that doesn't exist yet, they synthesize it. It will then solve the exact Schrödinger equation in a couple of picoseconds. Then you can read out the result by NMR or the like. The example of the spoon was just a tiny bit hyperbolic.
It knows what it isn't, perhaps. 🤔
And it even does it in its resting state!
@@tedjohansen1634 It is on the other side of the equals (=) sign.
If people are saying the Ising Model is a specific case that may not be relevant for real world computing, please note that the Ising Model is the basis of the Hopfield model which is a model of neural storage of associative memories in neuroscience. Quantum versions are predicted to have MUCH higher memory capacity than the classical version which only has capacity linear in the number of neural units (magnets).
Exactly, again that idea of a specialized purpose for memory. It's why our brain separates cortices based on function and purpose, or why we keep multiple drives to separate our work on a computer. Essentially, it's using a more organized memory structure to use memory space as efficiently as possible. It's also why we store memories that are associated in physical neuron proximity; it helps them fire off each other.
Neuroscience is still in the quackery stage, barely beyond lobotomy, just finer resolution.
Actually, associative memories modeled after the Hopfield model have a capacity that scales less than linearly in the number of units: O(N/log N), where N is the number of units, to be exact.
@@SurfinScientist Ah, even worse--thank you!
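For readers who haven't met Hopfield networks: below is a minimal sketch of the classical version being discussed, with Hebbian storage and asynchronous recall. The network size, the number of stored patterns, and the noise level are arbitrary illustrative choices, not numbers from any paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64                       # number of binary units ("spins"), illustrative
P = 5                        # stored patterns, well below the capacity limit
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian storage: W_ij = (1/N) * sum_p x_i^p x_j^p, with no self-coupling
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(state, sweeps=10):
    """Asynchronous updates: each unit aligns with its local field,
    which never increases the Ising-like energy -1/2 s^T W s."""
    s = state.copy()
    for _ in range(sweeps):
        for i in rng.permutation(N):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# Corrupt a stored pattern, then let the network settle back to it
noisy = patterns[0].copy()
flipped = rng.choice(N, size=N // 8, replace=False)
noisy[flipped] *= -1
print("overlap with stored pattern after recall:", recall(noisy) @ patterns[0] / N)
```

Running it, the corrupted pattern relaxes back to the stored one, which is the associative-memory behaviour the capacity estimates above refer to.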
Solving the Ising model (finding the ground state for a general coupling graph) is an NP-hard problem, and its decision version is NP-complete. By definition, every NP-complete problem can be mapped to every other one, so an efficient solution to one would solve all the others. This is why it is so relevant.
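To make that mapping concrete, here is a toy version of one standard reduction: Max-Cut on a small graph becomes an Ising ground-state search with antiferromagnetic couplings. The graph below is made up, and the brute-force search is only there to spell out what "solving" means; it is not how an annealer works.

```python
import itertools

# Max-Cut maps to minimizing the Ising energy H(s) = sum over edges of s_i * s_j
# (all couplings J_ij = +1, i.e. antiferromagnetic). Toy 4-node graph:
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

def ising_energy(s):
    return sum(s[i] * s[j] for i, j in edges)

# Brute force over all 2^n spin configurations; only feasible for tiny n,
# which is exactly why the general problem is hard.
ground = min(itertools.product([-1, 1], repeat=n), key=ising_energy)
cut_size = sum(1 for i, j in edges if ground[i] != ground[j])
print("ground state:", ground, "-> cut size:", cut_size)
```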
I've been waiting a long time for a cogent D-Wave analysis. Thank you.
In classical computing there is analog computation, where for example a dial is twisted by one process and the machine spits out a result elsewhere after electrical signals have been manipulated. Not too different from the truly classical manual computer, where someone did calculations by hand, say, looking up as precise a value as possible on a graph. Because of these two historic parallels, I'm inclined to actually call this a calculation.
Yeah, I don't really know a lot about what they're doing, but sounds to me like a "non-general purpose" analog computer
I agree but from another perspective: it takes numerical input (be it a configuration of qubits) and gives a numerical output, so it's a calculation. Simulation can also be, and often uses a lot of, calculations.
@@user-sl6gn1ss8p That is exactly what it seems to be, and truth be told, that can be extremely powerful given a question worth building such a system for
@@user-sl6gn1ss8p Quantum computing does not have to be general purpose in order to be disruptive.
If you and I could make a quantum SHA reverse solver that can almost instantly defeat SHA and can only do that, that alone would make us billionaires.
A device that instantly finds high primes would be another example of a non-general-purpose, yet very disruptive technology that could be achieved this way.
I think this is mostly just semantics.
If it takes an input and produces useful and non-trivial output it is a computation.
I think this is in the same class of computation as the DNA "computer", which can perform NP search and mapping problems by "annealing" millions or billions of macromolecules in parallel at molecular speed.
If the Everett interpretation is correct, this is pretty much how "Qbits" do it, just faster and parallelized over a chunk of a multiverse.
Are both of these devices computers?
Not if your definition of a computer requires a Turing-complete machine.
Even in conventional computing, annealing is a very interesting and cool way to solve problems. I've used simulated annealing myself, to optimise patterns in steganographic art. It's actually a lot of fun!
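For anyone curious what that looks like in practice, here is a minimal simulated-annealing sketch applied to a random Ising-style energy; the couplings, temperatures and cooling schedule are arbitrary choices for illustration, not tuned for anything.

```python
import math
import random

random.seed(1)

n = 30
# Random +/-1 couplings on a ring, purely illustrative
J = {(i, (i + 1) % n): random.choice([-1.0, 1.0]) for i in range(n)}

def energy(s):
    return sum(Jij * s[i] * s[j] for (i, j), Jij in J.items())

s = [random.choice([-1, 1]) for _ in range(n)]
E = energy(s)
T = 5.0
for step in range(20000):
    i = random.randrange(n)
    s[i] *= -1                      # propose a single spin flip
    dE = energy(s) - E
    if dE <= 0 or random.random() < math.exp(-dE / T):
        E += dE                     # accept the move
    else:
        s[i] *= -1                  # reject: undo the flip
    T = max(0.01, T * 0.9995)       # slowly cool the system
print("final energy:", E)
```

Roughly speaking, the hardware annealers discussed in the video swap the temperature schedule for quantum fluctuations, but the bookkeeping is similar.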
4:15 -- Awwwwww, Albert has a spoon, too!
Good catch 😂
there is no spoon
@@ricsouza5011Albert lives in a deterministic universe. There either is or is not a spoon.😊
It’s also been found that growing slime moulds can solve travelling salesperson type problems as an alternative to simulated annealing in digital computers. But it’s much harder to raise venture capital funds to grow slime moulds, and also much less sexy for clients to look at than all the quantum plumbing. [edit: This was meant to be irony, focused on the fund raising activities of quantum startups. But evidently irony isn’t everyone’s cup of possums juice]
Except slime molds solve the problem by simultaneously traveling through all paths and then start optimizing the most successful paths. It's not really a novel solution that classical computers can't perform. The problem is that it doesn't scale well. Which is why it's a problem in the first place.
@@TheGigaflop Slime mold probably has a better sense of humor than you though.
Yes, but have you seen slime mould gender stuff, I suspect the "more than two sexes" puts right wing investors off.
@@obsidianjane4413 It (they?) might have a better sense of humour, but its (their?) heckling sucks.
@@obsidianjane4413 Fair enough, just figured on a video about quantum processing, a little clarification about slime mold computation might be appreciated for those that don't know.
Elegant, clear explanation of how D-wave devices aren’t quite “computers”!
The definition of "computer" is an example of the imprecision of human languages. Computer was once a job description, of a person (usually a woman) who ran hundreds, to thousands, of math calculation during her work day. When applied to devices, it covered mechanical, and electronic analog computers, before being applied to digital computers. Some with the calculations "hard set" in their design. Others reconfigurable, or programmable.
I've used analog computers back in college, which is "programmed" by wiring up a circuit. I've also made a living programming microcontrollers, which are "hard coded" most of the time to specific functions. Even though some (the old HC05) have the von Neumann architecture (most are Harvard), will you exclude them as computers, when their app is hard coded ? Language gets political, like that !
Maybe it's more of an ASIC than a computer.
@@michaelmoorrees3585 Hard coded on a von Neumann architecture is an oxymoron, unless you are going to try the semantic dance of RAM vs ROM. A rather brief and futile exercise.
The contemporary definition of computer is only sullied by early squabbles (across the pond) of who had the first one (that was Manchester).
It's been a while, but I came to the conclusion that quantum machines cannot execute code...
A requirement for a processor is to have NOT and AND (or gates to the equivalent). Quantum machines cannot do this. They cannot implement the recursive solution of Towers of Hanoi.
What quantum machines *can* do is resolve. The "programming" is then just working out how to model a system in qubit terms.
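For reference, the recursive Towers of Hanoi solution mentioned above looks like this on a conventional computer (a standard textbook version, nothing annealer-specific); whether this kind of control flow can be expressed on an annealer is exactly what is being debated here.

```python
def hanoi(n, source, target, spare):
    """Move n disks from source to target, using spare as scratch space."""
    if n == 0:
        return
    hanoi(n - 1, source, spare, target)    # clear the way
    print(f"move disk {n}: {source} -> {target}")
    hanoi(n - 1, spare, target, source)    # restack the smaller disks on top

hanoi(3, "A", "C", "B")
```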
Define a "computer". Your definition may not be the same as other people's, even if you were to consult a well respected dictionary or textbook. As someone pointed out, once upon a time "computer" was a job description for humans. It's a very imprecise term.
@michaelmoorrees3585 @@another3997 maybe you think you're disagreeing with me? There's a reason I put the word in scare quotation marks. The video is referring to how the word has come to be commonly used these days, which, as you note, isn't the "one true definition."
Millions of years in a normal computer. Trillions of years in a DELL
So D-wave is designing the world’s first analog quantum computers…
What if this is the real path right now, and universal quantum computers are technology that's 100 years ahead? 😢
@@vulpritprooze What if apples fall from the sky and do not grow on trees?
How do the magnets and qubits actually do the quantum computing?
It's definitely not the first. Quantum simulators (such as via ultracold atoms) have been around for a while
I think the combination of analog computers with digital is going to be the most powerful option in the near term. Eventually combining digital, analog, and quantum might be feasible - maybe in a few hundred years?
Optimization problems, the kind that D-Wave's approach solves, are some of the most common needs for high performance computing and get used for many things. Nobody has yet proven there is even a quantum advantage for any other useful algorithm. So far, it's basically Shor's algorithm which, outside of academia, is mainly of interest to thieves and national intelligence agencies and I expect both are behind the funding for many of those projects. Given that, I'd say D-Wave's strategy is currently the only real quantum computing. Google can compute arbitrary algorithms on their platform but most, if not all, will be no faster than their classical counterpart and far less efficient.
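Since "optimization problems" covers a lot of ground, here is a sketch of the generic bookkeeping step such formulations rely on: rewriting a 0/1 (QUBO) objective in Ising spin variables via s = 2x - 1. This is the textbook substitution, not D-Wave's actual toolchain, and the example objective is made up.

```python
from collections import defaultdict

def qubo_to_ising(Q):
    """Rewrite sum_ij Q[i,j] * x_i * x_j with x in {0,1} in spin variables
    s = 2x - 1 in {-1,+1}. Returns couplings J, fields h and a constant c
    such that the objective equals sum J_ij s_i s_j + sum h_i s_i + c."""
    J, h, c = defaultdict(float), defaultdict(float), 0.0
    for (i, j), q in Q.items():
        if i == j:             # linear term: x_i = (s_i + 1) / 2
            h[i] += q / 2
            c += q / 2
        else:                  # quadratic: x_i x_j = (s_i s_j + s_i + s_j + 1) / 4
            J[(i, j)] += q / 4
            h[i] += q / 4
            h[j] += q / 4
            c += q / 4
    return dict(J), dict(h), c

# Tiny made-up objective x0 + x1 - 2*x0*x1, which penalizes x0 != x1
Q = {(0, 0): 1.0, (1, 1): 1.0, (0, 1): -2.0}
print(qubo_to_ising(Q))
```

The output is a single ferromagnetic coupling plus a constant offset, which is roughly the kind of (h, J) specification a problem has to be squeezed into before an annealer can touch it.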
You know, I'm something of a Spoon myself.
Great video, both for humour and clarity of explanation. Thank you, Dr Hossenfelder.
I'm glad you mentioned the spoon. Before you said it, I was thinking of "10 pebbles in your hand" where you throw them in a box and observe the exact positions down to 15 decimal places and thought "no quantum computer will ever be able to beat its accuracy in calculating/simulating the result of a throw"
"There is no spoon." LOL
“You are the spoon!”
There is no "self" in a quantum computer.
It is still an automaton.
You are the "self".
"Hero gets spell book, all spells and 999 current spell points" Heroes 3😂
Try to understand the ..truth 😮
"Do. Or do not. There is no try." (Yoda)
My understanding of the discussion surrounding D-Wave being / not being a quantum computer has been that it is not so much the simulation vs computation part, but the way they implemented the system. While a "proper" adiabatic QC tries to avoid an excited state, the D-Wave annealers happily leave the ground state and essentially rely on quantum tunnelling to bring them back down. Hence the debate whether a transition from QC -> non-QC -> QC is realistic. To date, none of the studies have shown that the D-Wave systems have a true quantum advantage, so it will be interesting to see what the effect of this new one will be (at least after peer review).
Yes, you are right that the quantum advantage has been questioned. But think of the spoon. The spoon also has a "quantum advantage". It's just not the sort of advantage that people usually discuss in terms of operations needed etc. I have little doubt that if you have a large quantum system like D-wave it will be faster than a conventional computer at *something* -- they just have to find that something. (And then there's the question whether that will be useful for anything, but that's a question you can also ask about the present-day universal quantum computers...)
@@SabineHossenfelder Isn't this "spoon" reference true for all quantum computers?
@@SabineHossenfelder The "classical" quantum computers that can run Shor's algorithm are also good at something, namely running Shor's algorithm. They are as universal as D-Wave's quantum computer is. So much for one-trick ponies.
@@SurfinScientist And now Shor's algorithm has been improved and can be run faster.
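For readers wondering what "leaving the ground state" refers to: quantum annealers of this type are usually described by a time-dependent Hamiltonian that interpolates between a transverse driver term and the Ising problem Hamiltonian, schematically (signs and normalizations vary between conventions):

```latex
H(t) = A(t)\sum_i \sigma^x_i \;+\; B(t)\left(\sum_{i<j} J_{ij}\,\sigma^z_i\sigma^z_j + \sum_i h_i\,\sigma^z_i\right)
```

A(t) starts large and is ramped towards zero while B(t) grows, so an ideal adiabatic sweep stays in the instantaneous ground state; thermal and diabatic excitations during the sweep are what the QC -> non-QC -> QC worry above is about.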
Amazing video yet again! Thank you for sharing purposeful information Sabine💪🏻
Objection. You say "you can't verify the solution because we don't have any computer that solves it".
But... the Ising model is (according to Wikipedia, I wasn't aware of it before, but I trust it on this) equivalent to an NP-complete problem. NP-complete problems are well known for being hard to solve but very easy to check a claimed solution. This is one of their core features. A solution to an NP-complete problem is very easy to check by a simple computer. The difficulty is in coming up with it to begin with. So if the Ising model can be translated into an NP-complete problem, then just take the problem they solved, translate it, translate their claimed solution, then have a checker verify it.
I didn't know about the Ising model and I know very little about the D-Wave quantum computer beyond what you say here, but as a computer scientist who knows about theoretical computer science, this claim that "because a problem is difficult to solve, it's difficult to check the solution" is outright wrong (and kinda basic in complexity theory), and I think it's wrong in this particular case too.
Could you clarify??
NP Hard not NP Complete
@@geordierose3242 Irrelevant. NP-complete means it's both NP-hard and in NP itself. According to Wikipedia, the Ising problem without external fields is NP-complete, and therefore in NP, and therefore solutions should be verifiable in polynomial time by a classical computer.
No. NP Hard problems are optimization problems that return real numbers. NP Complete problems are decision problems (yes/no). The former can't be checked (you don't know if you've got a global optimum) but the latter can.
@@geordierose3242 I'm sorry but that's just not the definition of what NP hard means. See en.m.wikipedia.org/wiki/NP-hardness
Now regardless of the name, you're right that if you're trying to find an optimal solution then it can be both harder to find and, more importantly, harder to verify. I do get a bit lost with the complexity properties of verifying optimality, but given how it's phrased in the video, the argument "we can't check the solution because we can't find the solution" is very misleading and potentially incorrect in the context of complexity theory. Maybe in this case it holds, because checking optimality of the solution is inherently tricky, but it should have been said with more care. And your definition of NP-hard is just wrong.
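One way to see where both sides may be coming from: evaluating the Ising energy of a claimed spin configuration is cheap, so that part of a solution is easy to check; certifying that no lower-energy configuration exists is the hard part, and verifying the full simulated dynamics is harder still. Below is a sketch of the cheap half, on a made-up instance (the couplings and topology are arbitrary, not those from the paper).

```python
import random

random.seed(42)
n = 1000
# Random couplings on a ring plus some random long-range links
edges = [((i, (i + 1) % n), random.uniform(-1, 1)) for i in range(n)]
edges += [((random.randrange(n), random.randrange(n)), random.uniform(-1, 1))
          for _ in range(n)]

def energy(spins):
    """Evaluating a claimed configuration costs O(number of couplings)."""
    return sum(J * spins[i] * spins[j] for (i, j), J in edges)

claimed = [random.choice([-1, 1]) for _ in range(n)]
print("energy of the claimed configuration:", energy(claimed))
# Cheap to evaluate; proving it is the *ground state* is the hard part.
```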
I like the humor and references you’ve worked into this one :) thank you as always for your work and insights 🎉
It's always interesting to see a German doing real humor. (Who's writing the scripts?)
@@frankman2 The residents of Berlin would like to have a word. But you might not understand the word.
I wish I had one of these D-Wave things in grad school. "Millions of years" is what my Ph.D. felt like.
😂 The spoon analogy is on point!
They just built a physical model of the exact process and measured the results.
You don't need quantum things to achieve "billions of years' worth of simulation" - we already have many lab simulators with sensors.
And now we see the dawn of quantum spoon physics.
The spoon does exist but only when being used.
Been doing Sabine's quantum lessons on Brilliant. That really helps to clear up some details.
After a teacher or someone says "I'll come back to that later" I'm waiting for that the entire time, and I cannot focus/concentrate on the things the teacher "wants to say first".
Thanks for looking into this. Quantum “computing” news is heavy on the basic physics side and almost non-existent on the computing side.
Thank you Sabine for making this presentation.
Wave Function Singularities, Quantum Gravity, Unified Field: Frequency Wave Theory
Expanding upon current theories of singularities with a hypothetical frequency wave theory introduces a novel perspective to understand the fabric of the universe and the singularities within it. Singularities, as understood in physics, particularly in the context of black holes and the Big Bang, are points in spacetime where gravitational forces cause matter to have an infinite density and zero volume. This traditional view, primarily rooted in general relativity, encounters limitations when it tries to describe the quantum aspects of singularities.
The introduction of a hypothetical frequency wave theory could aim to bridge the gap between quantum mechanics and general relativity, offering a unified description of singularities. This theory might propose that at the core of singularities, rather than an "infinite" point, there exists a fluctuation of energy or matter that is best described through wave functions. These wave functions could represent the oscillations of fundamental fields or particles at extremely high frequencies, which converge at the singularity.
Core Concepts of the Hypothetical Frequency Wave Theory
1.Wave Function Singularities: Instead of viewing singularities as points, this theory would interpret them as zones where wave functions of particles or fields reach extremely high frequencies. These frequencies could be associated with the energy levels and the quantum states of particles converging at the singularity.
2.Quantum Gravity Interface: The theory could provide a framework for quantum gravity, illustrating how gravitational forces operate at quantum scales. It might suggest that gravity itself can be quantized and its effects are mediated through wave functions that interact with the fabric of spacetime.
3.Unified Field Oscillations: At the heart of singularities, the theory might propose a unified field where all fundamental forces (gravitational, electromagnetic, strong nuclear, and weak nuclear) merge. The oscillations or frequencies of this field could dictate the properties of singularities, offering insights into their behavior and structure.
4.Quantum Foam and Planck Scale Dynamics: At the Planck scale, the theory could introduce the concept of quantum foam, where spacetime itself is subject to fluctuations. These fluctuations would be characterized by wave functions oscillating at the Planck frequency, potentially leading to the creation and annihilation of micro singularities.
5.Information Preservation and Transmission: Addressing one of the biggest puzzles in black hole physics, the theory could offer mechanisms for information preservation and transmission through the boundary conditions of wave functions at the event horizon of black holes.
6.Cosmological Implications: On a cosmological scale, the theory could provide a new understanding of the Big Bang and the evolution of the early universe. It might suggest that the universe itself originated from a singularity where wave functions of the unified field began oscillating, setting the stage for the expansion of spacetime.
Challenges and Future Directions
•Mathematical Formalism: Developing a rigorous mathematical framework to describe frequency wave functions at singularities is crucial. This involves integrating concepts from quantum field theory, general relativity, and possibly new mathematical tools.
•Empirical Evidence: Finding empirical evidence for the theory's predictions would be challenging but essential. This could involve observations of black holes, gravitational waves, and cosmic microwave background radiation.
•Quantum Computing and Simulations: Advanced quantum computing might offer new ways to simulate and explore the implications of this theory, providing insights into the behavior of singularities and the early universe.
This is a part taken from CHAPTER 1 of my recent book:
Wave Function Singularities, Quantum Gravity, Unified Field: Frequency Wave Theory
Available on Amazon here: read.amazon.com/kp/embed?asin=B0CW1GLHD9&preview=newtab&linkCode=kpe&ref_=cm_sw_r_kb_dp_9VVJSQ61YKT844ZSTZA0&tag=aideas07-20
Images were made with AI's imagination
#QuantumPhysics #QuantumGravity #UnifiedFieldTheory #WaveFunction #Physics #ScienceTwitter #Astrophysics #BlackHoles #SpaceTime #Cosmology #QuantumFoam #PlanckScale #Singularity #SciComm #EmergingTech
@neiltyson @carlorovelli @seanmcarroll @ProfBrianCox @lirarandall @michiokaku
Very insightful Sabine. Mr. X
As ever fascinating. Of course the only bit I really understood was the spoon but I’m so grateful for Sabine taking me places like this. She’s great.
I feel like it would make sense to call this an analog computer, similar to other types of analog computers where you create a circuit that obeys the same equations as the system you're trying to predict.
Thank you for the video.
Fascinating stuff indeed. Thanks, Sabine! 😊
Stay safe there with your family! 🖖😊
Cool. Hopefully I can use this in future to align my chakras, man.
The spoon analogy is not so silly. In the 1960s finite element methods were just beginning to be used in continuum mechanics (stress/strain analysis). Those methods were found to have limitations, as they still do today - especially in vibration analysis. Experimental stress analysis comes to the rescue. Basically, you build a device and test it. If you want to know how a spoon works, it's easiest to just make one.
Not sure about my spoons but my car keys seem to have quantum properties. They vanish from existence and cannot be found, only to reappear in a different location. As an older person, maybe if they could be in more than one place at a time it would be easier for me to get my hands on them. Now if only I could remember where I've left my beer I could enjoy the rest of this video.
Schrodinger's car keys exist in a quantum superposition with any location you aren't very consciously thinking of.
Legit, glad you brought up that it wasn't peer reviewed. Not that this isn't a real finding, but it shows that you're a real scientist.
It's not "Computation" on a Quantum "Computer"...It's actually just *Blending*. It's a blender. A blender is analogous to a Quantum "Computer".
You really like that blouse. Looks great on you.
4:15 LMAO! 🤣 "which I totally coincidentally..."
Is the big spoon entangled with Albert's little spoon ?
Sabine: "They say..."
Me: Okay, Sabine is not convinced yet. :D
High quality video. Thank you
It's absolutely crazy: I was trying to learn renormalization just yesterday and they started using the Ising model as an example, and now you are also talking about the Ising model.
Great advances in computing science, just brilliant.
Its exciting. I think looking at the preprints is cool because we can help peer review. Or at least learn stuff on the bleeding edge.
Seems you have more voice inflection in your presentation now Sabine, and more hand gestures / animation also.
Nice job
This video is really the Ising on the cake.
Super interesting! Question, is the "Salesman problem" suitable for quantum computers or simulations?
The funny thing with science is that it sometimes finds applications after a new "tool" has been made, right?
You mean the travelling salesman problem? Yes, it's one of the problems you can put on a quantum computer. I believe you can also do it by quantum simulation, at least in principle. However, it's a rather academic problem. There is a class of related problems in logistics that people want to look at with quantum computers.
I think the reason everyone is starting with the Ising model is that it's particularly easy to match onto qubits.
@@SabineHossenfelder The travelling salesman is often used as an example of an NP-complete (and NP-hard) problem because it's easy to explain and less abstract than a lot of other NP-complete problems. However, NP-complete and NP-hard problems come up in practical situations all the time, and if you've come up with a fast solution to an NP-hard problem, you've unlocked many others. In fact, if you find a polynomial-time solution for an *NP-complete* problem, you've solved all of them, because they can be converted to one another.
The problem is that nobody's been able to demonstrate that quantum computers are any better at NP-complete problems, and it's unlikely to be true. At best, it's speculation, but mostly it's marketing. If it were likely to be true, a lot more computer scientists and software engineers would actually care about quantum computers.
Importantly, factoring integers into primes is probably NOT an NP-Hard problem. That's the big one that will break cryptography and everybody gets excited about it.
Getting back to the OP's question, "is the "Salesman problem" suitable for quantum computers or simulations?" The answer is almost certainly "no". Like most problems, you can solve it with a quantum computer, but probably not significantly faster than a conventional computer.
It's also worth noting that >99% of the time you don't even need an exact solution to an NP-Hard problem and a heuristic (fast but inexact solution) will do. So even if quantum computers were good at NP-Hard problems, they'd still be almost useless if they're at all more expensive or difficult to use.
Technically speaking, classical computers might also be able to solve NP-complete problems in polynomial time - it hasn't been proven they can't. If you manage a proof that they can or can't, you will become very famous and win a million dollars because it's a millennium prize, and probably the one with the most practical applications.
You can put it on a quantum computer, just like you can with a conventional computer, but it won't help.
The traveling salesman problem is NP-complete, and quantum computers are not believed to be able to solve NP-complete problems efficiently.
Quantum computers, at least the fully capable quantum computers that no one is actually building yet, solve a category of problems called BQP. BQP contains P, the category of problems a classical computer can solve efficiently. It is not known whether BQP fits inside NP, but NP-complete problems such as the traveling salesman are believed to lie outside BQP. The best-known example of a problem in BQP that is thought to sit outside P is integer factorization.
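To make the brute-force scaling behind all this concrete, here's a minimal sketch in Python (the five city coordinates are made up for illustration, nothing from the video or paper): with the start city fixed there are (n-1)! tour orderings to check, which is why exact solutions become hopeless so quickly and heuristics take over in practice.

    # Brute-force traveling salesman: try every tour order and keep the shortest.
    # With the start city fixed there are (n-1)! orderings, so this explodes fast.
    from itertools import permutations
    from math import dist

    cities = [(0, 0), (1, 5), (4, 3), (6, 1), (2, 2)]   # made-up coordinates

    def tour_length(order):
        # Length of the closed tour visiting the cities in the given order.
        return sum(dist(cities[order[i]], cities[order[(i + 1) % len(order)]])
                   for i in range(len(order)))

    best = min(permutations(range(1, len(cities))),
               key=lambda rest: tour_length((0,) + rest))
    print("best tour:", (0,) + best, "length:", round(tour_length((0,) + best), 2))

Going from 5 cities to 20 takes the count of orderings from 24 to roughly 10^17, which is the whole point of the complexity discussion above.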
Quantum annealing also may have serious AI applications. Model training is an optimization problem, which is exactly what these machines are good at.
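In case it helps to see what "putting an optimization problem on an annealer" actually means: the problem gets rewritten as a quadratic function of binary variables (a QUBO), which is equivalent to an Ising energy, and the machine looks for a low-energy assignment. Here's a minimal sketch in plain Python for a toy max-cut instance (the three-node graph is invented, and the minimization is simply brute-forced here rather than sent to any annealer API):

    # Max-cut on a tiny triangle graph, written as a QUBO: minimize the sum of
    # Q[i,j] * x_i * x_j over binary x. Cutting edge (i,j) contributes
    # x_i + x_j - 2*x_i*x_j to the cut, so maximizing the cut is the same as
    # minimizing its negative, which is exactly this quadratic form.
    from itertools import product

    edges = [(0, 1), (1, 2), (0, 2)]
    Q = {}
    for i, j in edges:
        Q[(i, i)] = Q.get((i, i), 0) - 1
        Q[(j, j)] = Q.get((j, j), 0) - 1
        Q[(i, j)] = Q.get((i, j), 0) + 2

    def energy(x):
        return sum(c * x[i] * x[j] for (i, j), c in Q.items())

    best = min(product([0, 1], repeat=3), key=energy)
    print("partition:", best, "cut size:", -energy(best))

An annealer explores the same energy landscape physically instead of by enumeration; the hard part in practice is squeezing a real problem into this quadratic binary form.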
This was posted 27 minutes ago, and already 7352 views. RESPECT.
Thanks to you I checked out Brilliant and quite enjoyed it. Getting up to the end of the trial period though and I see that it's €100 a year, that's insane! There are so many good free resources already that I can't justify such a spend. I wouldn't mind if I could pay monthly for that but the monthly cost is something like the equivalent of $25 which is far too much.
Wait--spoons aren't real?!? Then what the heck did I just eat my granola with??? 🤢
It seems funny to me that some people do not think of a specialist machine as a computer. I can't imagine anyone saying that the machines onboard the Apollo rockets were not computers - yet they could not do general computations of, for example, bookkeeping. Nor could they have played chess, because they were hardwired to handle the specific calculations needed for the mission.
The answer it gave is "42".
incorrect. The overall answer is 1
Perfect
If any type of computing device is a one-trick pony, then calling it a spoon is pretty darn close. Actually, a spoon is useful for many things other than just its normal usage. It sounds like the D-Wave device is configured to solve one problem and only one problem. It might be able to be re-configured, but at what cost and with how much effort? If any project needs an army of technically minded experts to use it, I would say the spoon is more useful.
For Matrix reference 11/10 points.
I did Sabine's course. It is really nice, but it is short and just the beginning; I finished it in one day. I wish they added more advanced quantum topics in the same style.
"Spoons aren't real!"
F*** I was eating pudding! 😜
Shoulda paused the video
"As everyone knows spoone aren't real". I thought those were *birds*. Maybe I need an update...
Enjoyed the report. Sabine is the best at illuminating progress in physics.
Like D-Wave, a company in San Diego named MemComputing has been doing quantum simulations and has shown incredibly abbreviated compute times compared to conventional digital computers.
I believe there is a standards body that develops standard tests so that these companies that do quantum simulations can compete. For example, there are tests for optimization and for finding primes.
I am not an expert in this area but will try to find out more to add to this list of comments.
In the paper's conclusions they claim: "We simulated square, cubic, diamond, and biclique topologies that are relevant to applications in materials science and AI, and are amenable to scaling analysis through the area law of entanglement and universal quantum critical scaling. More challenging and irregular topologies-for example those corresponding to deep neural networks".
So if real-world applications exist and their results are applicable, then we can test the quality of those results on those applications.
Until this is done we should stay in a quantum state between amazed and annoyed.
Millions of years in just a few seconds. Love that! Thank you for the work you do....
✌️🤟🖖
If it accepts input and arrives at a correct answer, even a correct estimation, then it is, by definition, a computer.
Sabine, I'd like to buy this spoon.
How do you crank out so many good videos so often? Sorcery.
She's very busy, but she works them out with a small team.
Hello Sabine, I do my best to follow your channel, but often get lost in the labyrinth. Imagine yourself as a teacher of a class of 7/8 year olds, and me at the back sticking my hand up and asking, "Ms Hossenfelder, Ms Hossenfelder, it is not how that concerns me the most, but it is WHY?"
Sabine's spoon analogy and her ability to cut to the critical issue are very Feynman-like… very, very few people are this smart, practical and creative.
Well, this is fantastically surreal for me. Thanks Sabine! 🤓🙏✨
I was making a molecule in Caligari trueSpace Legacy Edition, and it slowed down horribly at 200 metaballs. The project requires nanobots to go in there and create a nanolithography machine, to make progressively smaller nanomachines. Then you can hold a Tesla gigafab in the palm of your hand. Send them to Mars to produce 32oz diamond-cut acrylic tumblers, then just keep stacking the cups to make a little shelter for more rare and interesting materials, and color the glass so you are amazed by the artifacts that have been actively arranged and collated.
Wow, this is incredible! I look forward to the day where we all have such a machine in our homes that can compute a very, very, very, very specific thing and nothing else.
"one trick spoons" - that had me crying with laughter
Please, Explain time.
Why do we say time passes slower or faster depending on your location and/or movement within the universe? Could it be it's just the measurement of time is different?
Scenario:
Calculate a future spot in earth's movement through the solar system and pinpoint it. Now pick three spots within the universe to observe the earth's movement until it reaches the calculated spot.
Observation location 1 is NIST laboratory in Boulder Colorado.
Observation location 2 is our moon.
Observation location 3 is an orbit around the sun at near light speed.
Now have all 3 observation locations start recording the passing of time at exactly the same time. All three observation locations would stop the time recording when the earth landed on its previously calculated spot.
Would it be illogical to say, the physical amount of time passed for each observation location would be the same? Could it be that the recording device used to record the passing of time showed differently, but the physical amount of time would be the same for all 3 observation locations? Does it seem reasonable that what we need is a way to record time with a device that can account for its location and movement within the universe, so that the recording of time passing is equal on all accounts?
The earth will move through the solar system and reach its calculated spot in the future in its due time. It won't be early and it won't be late. It will be right on time according to the calculation. Why would we say the passing of time is different for the 3 observation locations?
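For what it's worth, the textbook special-relativity answer, as a minimal sketch that ignores gravity (which would also matter for the Boulder and solar-orbit cases): each clock records its own proper time along its own path, and for a clock moving at speed v relative to the Earth frame the elapsed proper time is

    \Delta\tau = \Delta t \, \sqrt{1 - v^2/c^2}

So at v = 0.99c the factor is sqrt(1 - 0.9801) ≈ 0.14, and the orbiting clock accumulates only about 14% of the Earth-frame interval. This is not an artifact of the recording device: physical processes such as muon decay and the clock corrections built into GPS show the same difference, so the amount of time that physically passes really does differ between the three observers.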
I'm really curious if the phrase "There is no spoon" would have got Sabine into some sort of copyright thing or if she just didn't feel like taking the time to google the exact quote and decided to paraphrase. One of life's little mysteries, I guess. FWIW.
Quantum computing is like fusion power. Great in theory.
4:35 Is this a Matrix reference (first movie)? Where the kid explains that the spoon is not real and does not bend, but you bend yourself.
The (there is no spoon) reference was brilliant. hihi
Nice! The Ising model was my master's project at university in the late '90s, with a 25x25x25 grid. It's cool to see the problem is still there 😅! Go Ising, crash the new-gen computers!
Artificial cooling (better known as simulated annealing) was a thing in Operations Research circles in the '90s as a way to find minima.
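For anyone who hasn't seen it, here is a minimal sketch of that idea in Python, applied to a small 2D Ising grid (the 10x10 size, the coupling J = 1 and the geometric cooling schedule are arbitrary illustrative choices, nothing from D-Wave's paper): spins are flipped at random, downhill moves are always accepted, uphill ones only with a temperature-dependent probability, and the temperature is slowly lowered so the system settles into a low-energy configuration.

    # Simulated annealing on a tiny 2D Ising grid: lower the temperature while
    # flipping spins, so the system settles toward a low-energy state.
    import math, random

    N, J = 10, 1.0                      # illustrative grid size and coupling
    spins = [[random.choice([-1, 1]) for _ in range(N)] for _ in range(N)]

    def delta_e(i, j):
        # Energy change from flipping spin (i, j), with periodic boundaries.
        nb = (spins[(i + 1) % N][j] + spins[(i - 1) % N][j]
              + spins[i][(j + 1) % N] + spins[i][(j - 1) % N])
        return 2 * J * spins[i][j] * nb

    T = 5.0
    while T > 0.01:
        for _ in range(N * N):
            i, j = random.randrange(N), random.randrange(N)
            dE = delta_e(i, j)
            if dE <= 0 or random.random() < math.exp(-dE / T):
                spins[i][j] *= -1       # accept the flip (Metropolis rule)
        T *= 0.95                       # cool down

    energy = -J * sum(spins[i][j] * (spins[(i + 1) % N][j] + spins[i][(j + 1) % N])
                      for i in range(N) for j in range(N))
    print("final energy per spin:", energy / (N * N))

Roughly speaking, quantum annealing replaces the thermal fluctuations here with quantum ones, but the "let it settle into a low-energy state" logic is the same.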
Finally we can solve the Traveling Salesman Problem for the US😃
Great channel, keep it up Sabine
Maybe something like "quantum simulators" would be a good differentiating name for these things?
This may be a great way to speed up the otherwise glacial supercomputer optimisation of the magnet configuration in designing stellarators
still trusting these guys more than IBM
Can it find the question for 42?
Can it find the key to 1FeexV6bAHb8ybZjqQMjJrcCrHGW9sb6uF ?
Sabine, your spoon landed on my desk. You can pick it up between now and the year 10^100.
😂
You must walk before you can run, but I'll only be impressed by a quantum “thing” when it decrypts a currently unbreakable message within a minute or a second or so, or when that thing finds the provably best move at every position in a game of Go.
It is a kind of analog computer; such computers were common a long time ago, until they were superseded by digital computers. Analog machines were very effective for solving differential equations. The difficulty with this quantum machine lies in finding the proper analog problem to mimic the problem you are trying to solve. I would love to see them solve the Travelling Salesman Problem as a demonstration of a useful application.
Drawer with 10 spoons = quantum computer that can perfectly simulate up to 10 spoons in any physical disposition.
Hey Sabine, the idea of thermodynamic computing popped up in my YouTube recommendations. I want to hear more about that from the physics point of view.
Another Canadian VC money scam like D-Wave.
'Simulation' kind of implies 'computation', but hair-splitting aside, I think that (specialized) computation via setting up initial configuration of the system, and then letting that system settle into some kind of ground state is a really interesting approach.
I may be wrong (and I'm happy if I'm right 50% of the time, especially about things of which I have only a glancing understanding), but I do think I can already see how such an approach could be used for extremely fast factorization of public keys (used in public-key cryptography).
What one would need is a (dynamically resizable) 2-dimensional grid occupied by fermions (only fermions will do, because of the Pauli exclusion principle), with that grid occupying a vertical position in a gravitational field.
Start with the grid size of NxM (N horizontal positions [along x axis], and M vertical positions [along y axis], where N
Private Eye has a regular 'Me and My Spoon' feature. They should interview Sabine.
From the picture there appear to be at least two D-Wave devices. To check the accuracy of one, you could run the same task on the other and compare the results.
It's kinda like analog computers. Geared towards very specific tasks and provide fuzzy results that are hard to verify, but sort of match the expectations.
Yes, indeed! I was about to say more about this, but then I felt it was somewhat tangential.
3:06 The more you know!⭐️
Quantum biology is the most fascinating field to me that involves quantum properties.
Isn't it?
👍👍👍 Good job!! Congratulations. 🏆🎊🎈
It's like Janek's black box out of "Sneakers": the more complex it gets, the better it works. 👌
It is like saying that a wind tunnel used to test airplane wing designs is a computer. It is just a physical system very similar to the targeted one, so we can extrapolate some results with good accuracy.
If something gives you answers to one or more questions based on input data, a computation has been performed. A wind tunnel simply moves air in a specific direction, at a speed which may or may not be controllable. It is analogous to reading parameter data from a storage device or keyboard. The tunnel itself doesn't recognise the presence of a wing, and it doesn't react to its presence. That is different from the simulation within a quantum computer.
@another3997 I meant the wind tunnel plus the particular wings, and the computation would be the lift, etc. The qubits just simulate quantum interactions, and the configurations need to change to emulate different systems.
"a physical system very similar to the targeted one"