Maybe one of the biggest advantages of quantum computing is simply that it drives us to figure out new ways to optimize classical algorithms to do things we always thought we needed quantum computers for.
@@benjaminblack91 Fast quantum chemistry calculations would have practical applications in people's daily lives if successful for, say, drug development. Obviously it's a big 'if' though.
The reality distortion field is not accounted for in the Standard Model of physics, but it permeates almost everywhere in Silicon Valley. The RDF attracts money, and a lot of it, and somehow makes it disappear from normal 4D space.
You need to understand that IBM employs "breakthrough announcements" as a marketing strategy. It reinforces the belief that they are using advanced technology in their products. It's the same reason that Ford, Porsche, and Ferrari enter Grand Prix events.
Well, currently only Ferrari of those three takes part in F1. If all the QC 'breakthroughs' turn out like this, it will just make them look foolish, but they will still have their investors' cash.
I have never thought that the achievements of a racing team somehow translate to the technology used in mass-production cars. It only has merit in the production-car classes, where minimal modifications are allowed. People who are even a bit familiar with tech know that the gap between experimental tech and mass-produced tech is very big.
Last year, I read an IEEE article by IBM that said they had created a method of dealing with the inaccuracies of quantum computers. The problem was that it required a tremendous amount of supercomputer time to perform their algorithm. And when it came down to it, it was quicker and cheaper to run simulations on a conventional computer than on a quantum computer that had its data processed by their error correction algorithm. This was hailed as a great advance in quantum computing.
I wonder if there will be an ESTIMATION technique discovered that can be used as a cheap, high-speed way to bypass the supercomputing. It won’t be perfect, but “good enough” to enable progress forward.
Well: 1) Publication is a must. 2) A publication always finds a positive result. 3) Any knowledge gained is positive, so if we have proven for a subset of conditions that a procedure is completely impractical, the knowledge must be spread via publication. It can be argued whether publication is the best way to spread the fool's errors, but rule number 1 says 'publication is a must'.
@@imeakdo7 Sure. But people keep going on about quantum computing like it is useful now. It isn't. In a way, it's like graphene. We've all been told that graphene is going to revolutionize the world. But decades after its discovery, they've managed to find a few practical applications for the stuff, but the graphene revolution hasn't happened. I'm not suggesting that people give up on quantum computing any more than I'm suggesting that they give up on graphene. I'm just suggesting that they allow realism to tinge their optimism.
I really hate it when people see or hear news about something that will "revolutionize science", like The Line in Saudi Arabia or the hyperloop, and say "it will revolutionize urbanism" or "it will revolutionize transport", repeating the same arguments made in the news article without knowing a single thing about what actually happens there. The same goes for quantum computers.
Here, I'm going to explain something you should know but you don't. Every bit, and I mean EVERY bit, of "popular press" or "reliable sources" is either propaganda for a product, propaganda for a social movement, or propaganda for a government. It's obvious it is.
@@fuzzywzhe I'd say in many cases media isn't propaganda for a product, but the product itself. Commercial media makes up stories that sell to their audience.
@@fuzzywzhe And it really depends on what you include as reliable sources. A good bit of the information provided is actually verifiable, but few people put in the effort of doing so, and even true information can be used to support false claims.
@@roger5059 It's a waste of time filtering through the garbage of our "news" to find facts. You CAN find them, but they are always incomplete at best. We created the Internet not to complement television and radio and newspapers, but to eliminate them. Media is nothing but a commercial for everything from soap to social attitudes to war. That's why it has to be centrally controlled, to stay "on message".
It may be the case we're trying to build a 22nd century computer in the 21st century. Apparently, people imagined 20th century computers hundreds of years ago but simply did not have the technology to build one.
Like the Analytical Engine, which is a fully-fledged programmable (punched cards) Turing-complete general-purpose computer with real programs (many written by Ada Lovelace). Its inventor, Charles Babbage, described it in 1837, more than 100 years before Zuse, but only built parts of it. There are interesting discussions available from this time, e.g. CISC vs. RISC or widening arithmetics.
@@dmitripogosian5084 why fortunately? Because it would have worked, but the information age would have been bad for mankind? Or because it would have been too difficult? They showed a few years back that the tolerances of that time would have been good enough to build it.
Your videos remind me of watching Scientific American and NOVA as a kid, there was always this magical feeling to seeing what's new in science and how it might change the world. Now I get to enjoy this same feeling as an adult and I'm incredibly grateful for that.
I'm assuming they meant 10,000 years on a similarly sized processor (128 bits), which of course isn't relevant since no modern computer processor is that incredibly tiny.
@@stoferb876 Google on Oct 23, 2019: "Our machine performed the target computation in 200 seconds, and from measurements in our experiment we determined that it would take the world’s fastest supercomputer 10,000 years to produce a similar output."
The spookiest thing is that at some level, either everything is dependent upon circular logic... or it just "works", somehow? It's weird to think about how logic exists for no reason, and nothing existed before reality... not even emptiness itself! :D @@ricardoorellana3350
The Ising model is a model of a model. It is the foundation stone of statistical physics and has inspired many models of physical phenomena. In a sense, it is the most successful testbed in science. So, Sabine, the Ising model deserves more credit! :)
Sounds a bit like: "Today's scientists have substituted mathematics for experiments, and they wander off through equation after equation and eventually build a structure which has no relation to reality." -You know who.
@@baomao7243 You'd be wrong; statistical physics is one of the most successful parts of physics, and the Ising model (like much of stat phys) can be applied to many different problems with great explanatory power.
@@baomao7243 I think it's a dimensionally reduced covariance or correspondence of a model; with the right choice of symmetry, it's great. Like 2D topology theory explaining a 3D observation, or capturing a smooth manifold or a set of quantized vortices as spin numbers. It's set-theoretic in spirit: not modelling every detail, just the characteristics.
There seems to be only one task where they would be better: prime factorization. Without that application there would be less hype around quantum computers, IMHO.
The outrageous claims made by some of the so-called quantum computing players speaks volumes about science in general. It's less about discovery and passion for breaking new frontiers and more about pulling the wool over the eyes of investors. I'm sure the "ours is bigger than yours" competitions will continue though, like it is with AI and advances.
Once upon a time it was to discover new knowledge for the sake of new knowledge, and if it ended up useful then so be it. But the feudal patrons of old, who funded science out of a desire to seem generous to the arts, were replaced by the money makers who instead leveraged funding into projects and fields that would make them money. The explosion of engineers, applied scientists who don't care about new knowledge for knowledge's sake, have been required to keep the science pimping machine humming. Woe be upon you if you study something "useless" as a scientist, as you will have to beg and scrape for even small amounts of money.
The future of quantum computing is beginning to sound like that of nuclear fusion. With so many little roadblocks as well as major technological challenges standing in the way, I can't help but feel we will start to hear, _"It'll be ready in five more years, and then five more years after that, and then in five more years we'll just need five more years until all the kinks are worked out..."_
Usage of the transverse Ising model? The classical counterpart is NP complete and lays the foundation not only for all these optimization problems (finance, traffic optimization etc) but can be used for protein folding as well. For the error stuff - did you look at STA methods? Not perfect but quite interesting.
@@ucantSQ Not password. Cipher. Besides the Russians' favorite one, the one-time pad cipher, because that's impossible to crack. But shh, quiet about that inconvenient detail :)
Sorry, but the Ising model is not a purely academic curiosity. My professor said it's the "drosophila" or "harmonic oscillator" of statistical physics and I agree, since it's exactly solvable in some cases (serves as a toy model/ case study for new methods) and has plenty of applications, because there are mappings from different systems onto it (e.g. deep neural networks, magnetism, polymers or gases/ liquids). This particular quantum version can be mapped to a classical version where we find the many applications.
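For readers who haven't met it, here is the model in its standard textbook form (nothing specific to the paper under discussion; sign and coupling conventions vary):

```latex
% Classical Ising model: spins s_i = ±1, coupled along the edges <i,j> of a graph
E(\{s\}) = -J \sum_{\langle i,j \rangle} s_i s_j

% Transverse-field (quantum) Ising model: Pauli operators Z_i, X_i.
% The field term does not commute with the coupling term, which is
% what makes this version "quantum".
\hat{H} = -J \sum_{\langle i,j \rangle} \hat{Z}_i \hat{Z}_j - h \sum_i \hat{X}_i
```

The mappings the comment mentions (neural networks, magnets, polymers, lattice gases) target the first line; the transverse-field version is the kind of model the quantum-utility experiments simulate.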
Sabine flip flops on this, in a later video she flipped and was more optimistic now she's flopped again. Not quite the Schroedinger effect but we're getting there.
Sabine, what about Andrew Childs et al.'s paper "Exponential algorithmic speedup by a quantum walk"? Why does no one ever talk about this paper? It clearly demonstrates quantum supremacy in a strong sense (they prove that no classical algorithm could do better). It does unfortunately fall into the category of "completely useless algorithm that quantum computers are provably better than classical algorithms at", but it is nonetheless a "supreme" quantum algorithm, so I thought it would've gotten more attention (I put it right below Shor's algorithm, since Shor's algorithm is slightly more useful, but it's not going to make my internet faster or anything like that).
Very interesting video. I'm only a CS BSc, but do we really, practically, need a quantum computer to be able to tell whether or not there will be a quantum time/space speedup in general for algorithmic problems in a certain complexity class? The theoretical science says enough, right? As far as we know, at least, because we still cannot prove whether P = NP or P != NP. But to proceed with this we need money, and for money this needs to be marketed as bringing some kind of value/revolution, while we really have no clue whether that is going to be the case. But I might have missed some newly found studies in the last 10-15 years.
Quantum computing is right up there with AI and cryptocurrency as hyped technologies used to keep the investments flowing into R&D so tech companies can keep building those advanced and expensive computers. The demand driven by smartphones and streaming services has plateaued.
You know what, conventional computers are built on quantum-mechanical technologies, and that's why they're this powerful. Digital signals at very high transistor density are similar to quantum mechanics, with 2 superposition inputs for each qubit. Of course, conventional computers try to prevent quantum tunneling, while quantum computers try to use it. If we were concerned about calculation speed per bit/qubit, qubits would be 100,000x faster (if the input response can keep up to populate them all the time). But increasing the number of transistors by adding chip communications for parallel processing by that amount costs 100x less than adding 1 more qubit, and it also costs less energy overall in the same active CPU/QPU time. So... what's the hope of quantum computing? Just like the GPU evolved from video cards, an IPU (instantaneous processing unit) could potentially emerge from high-temperature quantum computers with a compact super cooler. For our general-purpose uses, only 6 qubits would be enough to accelerate lots of things.
This story sounds similar to that of D-Wave in 2013 (which I was somewhat involved in on the classical side). News reports at the time claimed that D-Wave's 439 (later 512) qubit adiabatic quantum computer was 10^8 times faster than classical algorithms for some task and 3600 times faster for another. The only trouble was they were comparing with terrible classical algorithms. It turned out that using a good classical heuristic, similar to belief propagation, that exploits the "thinness" of the underlying graph easily outperformed D-Wave's results. I haven't understood the Tindall et al paper properly, but it looks on a surface reading that something similar might be happening here with the IBM Eagle, in that it is based on a heavy hexagon lattice which, like the D-Wave graph of 2013, is only sparsely connected and has treelike correlation. The paper mentions that their method of belief propagation works well in such cases, which sounds similar to what I and others were finding in 2013. (If anyone is interested in revisiting the D-Wave story, see Scott Aaronson's blogs numbered 1400, 2555 and 3192.)
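To illustrate why sparse, tree-like instances are classically easy (a toy illustration only, not the belief-propagation method actually used against D-Wave or in the Tindall et al. paper): on a 1D chain the exact Ising partition function falls out of a transfer-matrix recursion whose cost grows linearly with the number of spins.

```python
# Toy sketch: exact partition function of an open 1D Ising chain in a field,
# computed by a transfer-matrix sweep. Cost is O(n) matrix-vector products,
# which is the kind of structure message-passing methods exploit on
# tree-like graphs. Parameters J, h, beta are arbitrary illustration values.
import numpy as np

def ising_chain_partition(n, J=1.0, h=0.5, beta=1.0):
    spins = np.array([1.0, -1.0])
    # T[s, s'] = weight for appending a new spin s' next to boundary spin s
    T = np.exp(beta * (J * np.outer(spins, spins) + h * spins[None, :]))
    v = np.exp(beta * h * spins)      # Boltzmann weights of the first spin
    for _ in range(n - 1):            # append the remaining n-1 spins
        v = v @ T
    return v.sum()                    # sum over the last spin's states

print(ising_chain_partition(100))     # exact, essentially instant
```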
As someone who works with one of the best quantum information scientists in my country, let me tell you something I said and he agreed with: "Quantum computing can either swim or sink, but it will do so in this decade." Quantum communication will probably still be a thing, because it is simpler to implement and it offers real advantages for security and cryptography, and we all know how much armies love those 2 things. But whether a computer to run quantum algorithms will ever come to be is still totally up for debate.
@@btf_flotsam478 I mean, I never read the article, but I know that China transmitted quantum information by satellite to Vienna. And even my country of Brazil was able to do experiments in the Rio de Janeiro bay area, transmitting quantum information across the bay. Again, I never read the articles, so I don't know what the success rate was. Although, just to be clear, my teacher only agreed with the part about sink or swim in this decade; the part about quantum cryptography was all my own thought.
@@nicolasreinaldet732 I am just stating my observations. I remember reading about quantum cryptography yonks ago and suspect the reason I haven't heard more of it is because of those technical hitches that are why these things completely vanish.
The things that get hyped up as world-changing always disappoint, while the actual world-changing technology is incredibly boring. The gauge block set is one of the most important inventions ever made in all of history, but most people don't even know they exist, and the inventor C.E. Johansson almost went bankrupt trying to sell them.
I agree for the most part. But AI will take away most of our jobs. It's beyond powerful. It has an IQ at Einstein's level and can replace even lawyers (also us physicists and mathematicians).
The greatest revolution in the entire history of science is very likely just the modern algebraic language. It went nearly unnoticed; it happened very, very slowly, starting with Arabic numerals, getting enriched with operators, new notations, and so on... Keep in mind there was a time when "the squared distance between two points equals the sum of squares of the difference in each coordinate between the points" was the entire Pythagorean theorem; it was the only way to write down its formula.
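To make the contrast concrete, the sentence quoted above compresses to one line in modern notation:

```latex
d^2 = (x_1 - x_2)^2 + (y_1 - y_2)^2
```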
1) It's kind of obvious that a highly optimized conventional algorithm has every chance of being more efficient than a crude version of a quantum algorithm that nobody knows how to optimize. Quantum algorithmics is confusing and just at its beginning. 2) Even if we assume we have a problem that we know for a fact to be in a complexity class "better" than a classical complexity class, a small instance of the problem can still be harder in the former class (just think of linear time with a very high constant vs. quadratic time; the complexity class hierarchy is only valid asymptotically). Today quantum computing only "works" for small instances of problems (because of the lack of stable qubits).
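A made-up numerical example of point 2: let the asymptotically better algorithm cost 10^6 * n steps (huge constant) and the worse one cost n^2 steps. Then

```latex
10^6 \cdot n < n^2 \iff n > 10^6
```

so for every instance below a million elements the "worse" algorithm wins. The same caveat applies to quantum-versus-classical comparisons, where today's per-operation overhead is enormous.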
Well, maybe it will be practical on the Moon... helium-3, low pressure is cheap, cooling is cheap, and leaks or explosions can maybe just be cast into deep space. The net energy gain is related to cooling costs and vacuums and such. Quantum computers can't work on the Moon, though superconducting ones might be easy, and cooling isn't as hard if there is a big conductor and the heat can be reused, not just blown away by a fan.
Fusion research has been at it long enough now to actually demonstrate incremental improvement. But yes, both suffer from the same fundamental problem of human impatience. The problem of a society and economic system that want a 3-10 year return on investment, in spite of being made up of a species that took THREE HUNDRED MILLENNIA to discover writing. It's a sad reality of our species that we're only motivated by short-term goals; admitting "this will take 100 years of continuous R&D" would basically mean humanity would shelve it and probably follow the dinosaurs into death-by-spacerock waiting for the "right time" to start working on it.
Except we know that fusion reactors are at least possible to create because nature has lots of examples. We still don't know if a real practical quantum computer can be ever done. A few qubits for a few microseconds is a cool toy, not a practical machine.
Well proving you could perform that specific task faster with a system we have much more experience with, after the initial test isn't surprising. Bit of an unfair comparison. Even if Quantum computing would only be useful for niche applications we can learn a lot by experimenting with it.
Not surprised. Every time, for I'm not even sure how many years now, that someone claims they've reached the stars in regards to quantum computing, and I go and actually bother to read up on the claim fully, I find it to be extreme levels of hype and completely unrealistic. I've been pessimistic about the idea ever since the first time I read up on it, simply because there's just no good way to make it work in any CONVENIENT way. It's possible to achieve various successes, sure, but they're in laboratory conditions and simply cannot be transferred over to a functional workplace environment. Not even as mainframes of the old "house-sized computers" era. And far too often, they essentially do not even work properly. So, seems like I get to start throwing around "I told you so" to various people. Again.
Quantum computing is one of those very hard technical goals. Is it something that should be pursued? I believe so. Is it something that we will realize in the near term? Probably not. The reason I think this is important research is that understanding how things work at a quantum level is actually quite important to our societal future. As we continue to scale sizes down, we do become more dependent on handling quantum issues. We might not ever get to a quantum computer that is worth the cost, though I believe the major issues with cost will be overcome, but what we learn along the way is priceless.
I think it is not emphasized enough how much, in a way, the interest in quantum computing goes way beyond solving logistics problems, factorizing big numbers, and inverting matrices. Quantum computing is the ability to control and measure systems at the most fundamental level known. It's a (sophisticated) version of learning how to move rocks, manipulate matter, synthesize new stuff with chemical reactions, and so on; there is just no way it will not be useful. We already achieved a level of control at the quantum level unthinkable in the 90's on many physical platforms: neutral atoms, ions, superconducting qubits, even nuclear resonance. Optical tweezers, laser cooling, and other techniques were at least partially developed for these purposes, and they have all kinds of applications in low-energy physics. We should also not forget that without the ability to produce entangled states we would still not know whether hidden variables were a thing, and without squeezed states no gravitational waves can be detected. Not to mention that we can only understand whether decoherence is intrinsic to physical interaction (whether because of gravity or not) by trying to build larger and larger coherent objects, which is kind of what Q. computing is. It hasn't anything to do with nuclear fusion, even though the continuous (very annoying) claims of IBM and Google that in 10 years they will have a zillion-qubit smartphone make it sound the same way.
I found it worthwhile learning how to program quantum computers. It helped me a) understand quantum mechanics and b) demonstrate to myself that they are waaay off being useful just yet
Quantum Computing is ultimately about a faster ALU, but most of the compute task's time is spent elsewhere in staging the relevant data not the actual arithmetic manipulation in ALU. Even If the time delay through the John Von Neumann Bottleneck at ALU was reduced to zero, almost all of the total runtime required for typical computation-intensive tasks would remain. Centrally triggered, distributed computing is needed.
Yeah, and if he returned, he'd be appalled to see we're still architected as if high-bandwidth networks and essentially free compute units didn't exist. @@retiredbore378
Yes. This is why you should always trust it. Case in point... the first 15 seconds of this video explain to you why you SHOULD question everything. Everything is science. Everything is "right". Until... it suddenly isn't.
What did Einstein mean then with "spooky action at a distance"? I always thought he meant entanglement, and Google and ChatGPT also say so, so I can't find the real meaning.
Sabine please answer. From en.wikipedia.org/wiki/Einstein%E2%80%93Podolsky%E2%80%93Rosen_paradox "The thought experiment involves a pair of particles prepared in what would later become known as an entangled state. "
On the subject of Einstein, it is time to give up the springy Einstein prop on your otherwise excellent science videos. Einstein was the greatest and most transcendent of many great physicists. He doesn't need to be parodied by a springy prop, and I don't think any of us should try to second-guess what Einstein would say or think almost 70 years after his passing. The prop might get you more youtube views, so perhaps you are doomed to continue this gimmick.
I don't think people realize just how important advancing the speed of materials science is. The amount of undiscovered potential chemicals, alloys, and other materials is frankly staggering. The amount of stuff you could do if you had the right material is crazy.
Trotterization is not a specification of the model; it is a specific way of solving the problem. In fact, it's quite general: one can use it for any given Hamiltonian to do the quantum phase estimation experiment.
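For context, this is the generic Lie-Trotter product formula the comment refers to, for any Hamiltonian split into non-commuting pieces H = A + B (textbook form, not tied to any particular experiment):

```latex
e^{-i(A+B)t} \;=\; \lim_{n \to \infty} \left( e^{-iAt/n}\, e^{-iBt/n} \right)^{n},
\qquad
\left\| e^{-i(A+B)t} - \left( e^{-iAt/n}\, e^{-iBt/n} \right)^{n} \right\|
\;=\; \mathcal{O}\!\left( \frac{t^{2}\,\|[A,B]\|}{n} \right)
```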
The last time I investigated quantum computing, I was told that you need a system next to the QC which is almost as massive as the calculation you are trying to do using the qubits... meaning you don't actually need the qubits in the first place.
was told, "Quantum mechanics is probably one of the best understood branches of all of physics.", this morning by someone that couldn't tell me how long quantum links last. this is an entertaining field of physics.
If I'm not mistaken, the first quantum computer was brought out by the Canadian quantum computing research company D-Wave barely more than a decade ago, yet the utility of quantum computing technology is brought into question because it doesn't already meet expectations by always outperforming classic computer systems? 🤔🤨
I suspect that quantum computers will turn out to be even less useful because of Digital MemComputing (DMM), which is a new type of computing system that may provide strong competition for certain problems (such as factorisation). At the moment they are flying under the radar, but they seem to be making great strides towards building practical systems, so they will surely gain attention sooner or later. Most recently, I read they managed to implement a DMM using an off-the-shelf FPGA, which could be a game-changer. I only came across DMM recently, and am still absorbing how it works, but basically it's using memristors (or circuits carefully configured to behave like them, until memristors can be cheaply fabricated) to implement digital logic circuits that can (potentially!) solve sufficiently large problems thousands or even millions of times faster than traditional CPUs. How does it manage such big speed-ups? Their high-level explanation is a bit too opaque for my liking, but from what I can tell all components work in parallel towards finding the solution (effectively testing many solutions in parallel), whereas a traditional CPU can only test one solution at a time. So the larger a DMM can be made, the bigger the speed-up it will get. E.g. they say they can solve NP-complete problems in polynomial time and resources, whereas traditional computers would take exponential time. Another way of looking at it is that they run digital logic circuits both forwards (as is traditional) and backwards at the same time (which is initially mind-bending), until a solution is found that works for all logic circuit elements at once, which feels a bit like how quantum computers also work.
Classic IBM marketing; see past years when IBM claimed early leadership in cloud computing and the like. But that's OK, MSFT, GOOG, ... all have "eager marketing" efforts. QC is perhaps going to be a new paradigm of computing, next to analog and digital computing. Lots of challenges to go, but a 'killer app' for QC will be breaking cybersecurity as we know it today.
@@phoenixrising164 No it's not, the term 'Cloud' came when Ray Ozzie @ MSFT proposed it as a new strategy and some folks (incl. @ IBM) all went "..what! That's time-sharing again!" I was @ MSFT then and I had a front-row seat when it all happened.
I wish i knew what you are talking about, but i'm happy someone here is more involved in quantum computing than the average viewer, whose knowledge about quantum physics comes from short youtube videos, such as myself :). I hold my thumbs for all those who have the brains and resources to explore this field. I hope one day we will have reliable quantum machines. The reason i gave up on quantum is the expensive equipment that is necessary to experiment and get the "feel", what it actually does. Intuition is useless to explore it, and as a hobbyist, i was left with math and whatever others have discovered that could be a dead end, and still there are still many questions along the way no book can answer. My position is that this field is reserved for those who dedicated their lives/careers.
They are so desperate for positive quantum computer news to sustain funding that they are basically busy trying to do just that. And even that is not succeeding. I think quantum computing will never be viable. The more qubits, the more noise. The more noise, the more qubits you need to compensate for the noise. It's exponential. For quantum computers to be useful, they have to consist of enough qubits. It might very well be the case that we will never be able to get a useful quantum computer, unless we somehow learn more about physics and find some way to reduce noise that we have not found previously, which doesn't look likely at this point. I don't even think we will have economical fusion reactors soon, or at all. But I deem fusion reactors more viable than quantum computers, lol.
Entanglement doesn't exist in conventional computers? I'll have to disagree with you on this one, Sabine. Maybe you haven't seen my parents cable management strategy for their computer from the early 2000's. There is literally no other word to describe that thing but entangled.
As a mathematician and software developer I've read a lot of QC papers. There are a lot of buzz words the authors didn't seem to understand and hype that they didn't seem to want to actually support which is not exactly what I expect from scientific papers. After two decades of this I'm pretty confident quantum computing is the fusion power of the computing world.
This was said about machine learning from the 80s to 2012 as well. Furthermore, quantum computing is not computer engineering; it is foundational research at the moment. Judging quantum computation based on social merit is proof in itself of the usability of said field. Mostly it is still in the field of very foundational physics research. Just think about judging string theory or quantum gravitation by comparing them to some semi-classical theories in regards to how many new materials have been created (semiconductors, polymers, etc.). Most fields in physics and mathematics are not judged based on their relevance to society, and we have a field here which is. I find it even harder when the person who criticizes quantum computation worked her entire life in quantum gravitation, which definitely has not produced anything that can solve a real-world problem^^.
@@DatDaDu That was a lot of buzz words with nearly zero meaning. String theory is nonsense and everyone who knows anything about physics and mathematics knows that. The people still pushing it are cranks or grifters. That you are not aware of that makes you either a crank, grifter or ignorant of basic reality. Quantum gravity is a fundamental problem that has to be solved. Gravity exists at the macro level. Why does it "disappear" at the quantum scale? Again your fundamental ignorance of the current state of real world physics and mathematics is astounding and disqualifying.
Quantum computers have advantages in certain applications, specifically computing exponentially. Also you would have to question how much energy is used by a conventional computer vs a quantum computer for the same task.
For the first time someone helped me understand what a quantum computer is. The key words were "data storage". I realize that's obvious to some but now I get it......I think. It seems to me the idea behind quantum computing is to allow the computer to "multitask" but on a crazy level.
My handwaving big cheat rule of thumb interpretation of quantum mechanics says that for all objects heavier than the Planck mass, we can replace the Schroedinger equation by a classical system with the addition of ordinary classical Brownian motion. The Heisenberg Uncertainty Principle is replaced by the Fuerth Uncertainty Principle on the same scale. Just suppose this were true, what would it mean for quantum computers? Rapid and excessive decoherence? My interpretation fails on Bell’s inequalities, superfluidity and chain reactions by the way. We can still distribute computer simulations incorporating it with a random number generator, and ask everybody to think of something better. It is possible that we may go for a hybrid simulation with a mixture of this big cheat for the detectors and the something better for the microworld. My own idea of something better is tachyonic Brownian motion which begins by being orthogonal to everything else. A detector based on TBM might be at least as complicated as two molecules of nitrogen tri-iodide, which is the motivation for sticking with the big cheat.
If I'm alive in 50 years (doubtful; nuclear war, climate change, my spouse who is upset at me leaving the toilet seat up), I want to come back to this video, look at how we went back and forth on the judgment of their practicality, and laugh.
If a photon behaves as a wave and is pure energy, how can it travel in a wave 🌊 without having additional force or losing some energy? Because the up-down movement must take some force, even if it doesn't have mass. Are the spatial/gravitational fields sandwiching the photons along their trajectories? Also, if you do the double-slit experiment 100 times, do all the particles always get detected in the exact same places, or are they randomly detected? Thanks
Great video, finally one not hyping Quantum Computing. We are so far from understanding quantum, we don't know if quantum computers are even possible. Is there anything useful that "quantum computers" we have have actually calculated so far, after decades since the first one was "created"?
The question isn’t what quantum computers can do in theory. It’s about what we have actually used them for and how we utilize the generated outcomes. It doesn’t matter whether we have a functional QC; it only matters that its function is useful.
None of the algorithms I've seen so far make use of entanglement. It's all just based on the phase of the superposition of the two states of each qubit, and the phase kickback between target and control qubits in controlled operations. The algorithms are either really contrived (is my oracle balanced?), or are actually hiding an unreasonable number of operations (quantum phase estimation, Shor factoring) in a way which doesn't make sense considering a quantum circuit is not actually a physical circuit.
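A minimal numpy sketch of the phase kickback mentioned above (plain linear algebra, no quantum SDK assumed): a CNOT whose target is prepared in the |−⟩ state leaves the target unchanged and instead flips the relative phase of the control qubit.

```python
# Phase kickback demo: CNOT acting on control |+>, target |->
# yields control |->, target |-> -- the "kick" lands on the control's phase.
import numpy as np

ket0, ket1 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
plus  = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)

# CNOT with the first qubit as control, basis ordering |00>, |01>, |10>, |11>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

state_in  = np.kron(plus, minus)              # control |+>, target |->
state_out = CNOT @ state_in
print(np.allclose(state_out, np.kron(minus, minus)))  # True
```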
Have we done sufficient testing to see what can cause wave collapse? I wonder what different things are able to cause the wave collapse. Can a bird, bug, electron, photon, sound, or vibration cause the wave collapse, or is it centered around people and detectors?
You didn't mention *time complexity.* Yes, you did mention how regular computers have an advantage that makes them seem as if they're always faster. But it's important to note that QCs may have better asymptotic complexity than RCs/CCs. The only way to know (as of now) is via mathematical predictions, since we can't do those benchmarks
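The standard example behind this point is Grover search over N unstructured items (textbook query counts, ignoring error correction):

```latex
T_{\text{classical}}(N) = \Theta(N) \text{ queries}, \qquad
T_{\text{quantum}}(N) = \mathcal{O}\!\left(\sqrt{N}\right) \text{ queries}
```

With per-query costs c_q much larger than c_c folded in, the quantum side only wins once N > (c_q/c_c)^2, which is exactly why benchmarks at today's instance sizes cannot settle the asymptotic question.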
@@charlesmott1267 EPR is but is also not related to spooky action. He was referring to instantaneous measurement update; this can be confirmed by the basic timeline of when he used those words relative to EPR. Sabine does have a video on this.
LOVE that confused look of the emoji. Maybe a quantum computer can help to narrow down the possible range of the mass of an elusive particle, so that you can save the time needed to find it with a fitting detection device? This is the only practical use I can imagine.
Haha love it "10 years in silicon valley is only worth 5 minutes in China" 😂 The crazy part is it's not far off, that's why we in Chinese Tech companies beat MANY us companies until the US "free and open market" the cornerstone in capitalism needed retraction and now is only true if it's on US terms 🤦 加油🇨🇳👍
I am just going to chime in from the “quantum chemistry” field. There are certain calculations that can be sped up dramatically by a quantum computer (for those interested: “configuration interaction”). Although it would seem, recently, that the easiest way to do this is via a hybrid model, where the “quantum computer” acts more like a dedicated card (imagine a graphics card) and the bulk of the calculations is still done using a regular computer.
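A rough sketch of the hybrid pattern described above, in the spirit of variational quantum eigensolvers: a classical optimizer drives the loop and only the expectation-value evaluation would be offloaded to the quantum "card". Here that step is mocked with exact numpy linear algebra on an invented one-qubit toy Hamiltonian, purely to show the division of labour.

```python
# Hybrid classical/quantum sketch: the classical optimizer proposes circuit
# parameters, the "QPU" (mocked here with exact linear algebra) returns an
# energy estimate, and the loop repeats until the energy stops improving.
import numpy as np
from scipy.optimize import minimize

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = 0.5 * Z + 0.3 * X                 # toy stand-in for a molecular Hamiltonian

def ansatz(theta):
    # |psi(theta)> = Ry(theta)|0> = cos(theta/2)|0> + sin(theta/2)|1>
    return np.array([np.cos(theta / 2), np.sin(theta / 2)], dtype=complex)

def energy(params):
    # On real hardware this expectation value is what the QPU would estimate.
    psi = ansatz(params[0])
    return float(np.real(psi.conj() @ H @ psi))

result = minimize(energy, x0=[0.1], method="Nelder-Mead")
print(result.x, result.fun)   # optimal angle and estimated ground-state energy
```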
Hello, I sincerely thank you for your contribution to the development and popularization of science. I would also like to express my deep gratitude to you for making a video with science news because in the modern world there are very few sources from which you can hear truly important research and scientific discoveries
I wouldn’t be too pessimistic still - think of very early photography incl. lab time vs painting. Those grainy early pics and the equipment cannot touch what a modern phone camera can do. So, we‘ve seen dramatic improvement before. When thinking of the true purpose or niche quantum computer eventually might occupy - the only one I really believe in is optimization tasks. Logistics in the first place. But once this works, more interesting fields will take the stage with their specific optimization problems. Think of optimizing a whole product including its design, manufacturing, logistics, and business model including a fair bit of uncertainty from the outside world. That’s big $. While it’s hard/impossible to show that quantum computers will eventually master such tasks, it is easy to show that conventional computers will never (that NP thing).
How do you do a quantum calculation? Quantum mechanics is based on probability. You don't know for sure. How do you put together a series of steps when it uses uncertainty at a fundamental level?
Quantum computers are deterministic up until you perform a measurement. However, sometimes you can manipulate probabilities so that the probability of the measurement being X is 100%, making the algorithm fully deterministic. As a simple example, take a quantum algorithm made up of two Hadamard gates in series. If you send in a basis state, the first Hadamard sets it into a superposition, but the second one undoes it perfectly, leaving you with the original state.
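The two-Hadamard example above in a few lines of numpy, to show the determinism concretely:

```python
# Two Hadamards in series return |0> to |0>: the intermediate state is an
# equal superposition, but the final measurement gives 0 with probability 1.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
ket0 = np.array([1.0, 0.0])

after_one = H @ ket0        # (|0> + |1>)/sqrt(2): 50/50 if measured now
after_two = H @ after_one   # exactly |0> again
print(after_one, after_two)
```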
Don't regular processors have a lot of redundancy and error correction built in? As far as I know, in the beginning processors had to be perfect, with no defects, or they were scrap. Now, with 100 million plus transistors, one or two are likely to be faulty. Are the quantum computer developers taking the semiconductor lessons on board?
When people talk about quantum computers they see them as the future when they really need to be seen as computers nearly half a century behind and will spend another half decade catching up to a technology that is still advancing.
To be honest, part of my issue with many articles like this one is that even if I agree with the science behind them, I feel mixed about the conclusion, because when discussing practicality, it is in reality often much more of a comparative lens than people make it. There are many things that have already proven their practicality to experts in their initial usage, but because that usage isn't as relatable to the public eye, they often need to find ways to expand their usage. This is often the case with breakthrough innovations. It isn't necessarily that they truly have no usage; it's about the attempt to expand the usage to more generalized fields.
A fuzzy, wrong-language question... On the topic of the limited usability of quantum computers: is it because of conditional branching calculations, I suppose? Some of them can be simulated by extending the horizontal solution area further, but I suppose there are perhaps cases in which you can't map the time dimension into the quantum solution space (an implicit computational irreducibility, which Wolfram loves, haha). On the other side, you debunked the recent back-in-time quantum experiments, but if that finally came true, don't you think it would be exploitable to solve the inherently sequential calculation of some computer problems? Is it correct that the problem is that some computer algorithms are inherently sequential and because of that can't exploit quantum properties? If that is the case, could some apparently inherently sequential ones later be converted to a quantizable solution space? Are back-in-time quantum effects totally ruled out, or was just one particular experiment debunked? I know lots of people are biased towards a more consistent, predeterministic explanation of quantum mechanics, but any idea whether there are possibilities of being able to exploit quantum effects in any type of algorithm?
Can we use the WKB approximation to set estimates for how much misinformation can tunnel from desperate quantum computing project managers through into science media and when? We can assign the depth of the energy well proportional to the commercial branding value unlocked by publication of the misinformation. The when part applies to pre and post debunking. Who wants to republish provably wrong news ?
Meeting the authors in person, they stress that the paper was about utility, not about advantage. Although it is funny that, when papers are published where they use IBM's Qiskit and a laptop to outperform the quantum processor, you can see that all the classical methods fail to give accurate and coherent results when used outside the classical methods' applicability range, and differ significantly from the quantum simulation. It is a question for the next papers to show whether quantum computations outperform classical algorithms in terms of parameter range.
4:35 I've found in a couple different areas of life that messy, unconventional methods of solving problems often leads to the development of faster, more efficient conventional methods of solving the problem. It was the act of the messy, unconventional solution that spurred the development of the orderly, conventional method. Could that be what we're seeing here? Quantum computing may not actually be better at doing computations, but it's better at devising solutions for conventional computing to do computations.
Every time I see an expert talking about quantum computers, it's the same conversation: "the bits of the quantum computer are not limited to 0 or 1, instead, they can take on any value and this makes the computer much faster..." My question is the following: If this is so, then what is the difference between a quantum computer and an analog computer? Analog computers also operate with parameters that can assume any value and are even faster, they are instantaneous. Maybe it would be a good idea, before digging deeper into quantum computing, to go back into the past and try to see how analog computers could be more efficient using the materials and means we have today.
Buddy, they are just trying to imitate the brain through quantum computing. Can't beat the biological computer yet, and can't draw hands properly 😂. And again buddy, what is an analog computer? Some abacus, or a spinning wheel with number gears or something, like the Bombe? Those are incredible slowpokes.
This is at least the second time I've heard a "Someone found a conventional computing way to do this thing faster than a quantum computer". Thus far, the biggest benefits from quantum computing have been encouraging us to learn how to make normal computers do stuff faster. Well, it's something at least.
Lots of these theorems work by 'dequantizing' the quantum algorithm. I don't know if that's the case here.
This is about understanding math rather than quantum physics. If one uses a quantum computer to solve a problem a traditional computer can handle, this is a logical conclusion. It is also possible it was a hoax, to obtain money for the research.
And the ability to factor large numbers becomes less and less attractive when cryptography already is moving away from such algorithms.
Corn chips work good for some things. Gas holograms work good for other things.
@@distiking Siphoning money that could actually be put to good use elsewhere is the crowning achievement of quantum computing. It is otherwise a useless gimmick.
The only reason quantum computers are deemed useful for financial analysis is because they're marketed as being useful for financial analysis. They're marketed that way because they're marketed toward investors.
Next marketing strategy:
Quantum Computers are faster at growing investor's returns.
It is also for turbulence simulation, chemistry, etc. 😊
@@eddyr1041 This hasn't been demonstrated. The cycle goes like this:
1) Quantum computer huckster says "quantum computers will be better at X."
2) Somebody shows a barely functional or purely theoretical version of X on a quantum computer, or more likely, an idealized model of an error-free quantum computer.
3) Somebody else shows that conventional computers can do X just fine.
True
Yes & no. To win on the market speed is everything... down to the .000001 of a second
We could use quantum computers to better understand quantum mechanics, if only we better understood quantum mechanics
By better understanding quantum mechanics, we can build better quantum computers…
QM classicalized in 2010. Juliana Mortenson's website Forgotten Physics uncovers the hidden variables and constants and the bad math of Wien, Schrödinger, Heisenberg, Einstein, De Broglie, Planck, Bohr, etc.
THEORY OF EVERYTHING IDEA: Revised TOE: 1/24/2024a:
TOE Idea: Short version: (currently dependent upon the results of my gravity test):
The 'gem' photon is the eternally existent energy unit of this universe.
The strong and weak nuclear forces are derivatives of the electromagnetic ('em') interactions between quarks and electrons. The nucleus is a magnetic field boundary. 'Gravity' is a part of electromagnetic radiation, gravity acting 90 degrees to the 'em' modalities, which of course act 90 degrees to each other. 'Gravity' is not matter warping the fabric of spacetime, 'gravity' is a part of spacetime that helps to make up matter. The gravity and 'em' modalities of matter interact with the gravity and 'em' modalities of spacetime and the gravity and 'em' modalities of spacetime interact with the gravity and 'em' modalities of matter.
TOE Idea: Longer version: (currently dependent upon the results of my gravity test):
THE SETUP:
1. Modern science currently recognizes four forces of nature: The strong nuclear force, the weak nuclear force, gravity, and electromagnetism.
2. In school we are taught that with magnetism, opposite polarities attract and like polarities repel. But inside the arc of a large horseshoe magnet it's the other way around, like polarities attract and opposite polarities repel. (I have proved this to myself with magnets and anybody with a large horseshoe magnet and two smaller bar magnets can easily prove this to yourself too. It occurs at the outer end of the inner arc of the horseshoe magnet.).
3. Charged particles have an associated magnetic field with them.
4. Quarks, protons and electrons are charged particles and have their associated magnetic fields with them.
5. Photons also have both an electric and a magnetic component to them.
FOUR FORCES OF NATURE DOWN INTO TWO:
6. When an electron is in close proximity to the nucleus, it would basically generate a 360 degree spherical magnetic field.
7. Like charged protons would stick together inside of this magnetic field, while simultaneously repelling opposite charged electrons inside this magnetic field, while simultaneously attracting the opposite charged electrons across the inner portion of the electron's moving magnetic field.
8. There are probably no such thing as "gluons" in actual reality.
9. The strong nuclear force and the weak nuclear force are probably derivatives of the electro-magnetic field interactions between quarks and electrons. In the case of the alpha particle (Helium nucleus), the electro-magnetic field interactions between the quarks themselves are what keeps them together in that specific structural format.
10. The interactions between the quarks EM forces are how and why protons and neutrons formulate as well as how and why protons and neutrons stay inside of the nucleus and do not just pass through as neutrinos do. (The neutrino being a substance with a very high gravitational modality with very low 'em' modalities.)
11. The nucleus is probably an electro-magnetic field boundary.
THE GEM FORCE INTERACTIONS AND QUANTA:
12. At this time, I personally believe that what is called 'gravity' is a part of electromagnetic radiation, gravity acting 90 degrees to the 'em' modalities, which of course act 90 degrees to each other. 'Gravity' is the force which allows a photon to travel across the vast universe without that swirling photon being flung apart or ripped apart by other photons and/or matter interactions. Gravity being a part of the 'em' photon could also possibly be how numbers exist in this existence for math to do what math does in this existence (the internal oscillations of the 3 different parts of the 'gem' photon, each modality having a maximum in one direction, a neutral, and a maximum in the other direction.) 'Gravity' is not matter warping the fabric of spacetime, 'gravity' is a part of spacetime that helps to make up matter. The gravity and 'em' modalities of matter interact with the gravity and 'em' modalities of spacetime and the gravity and 'em' modalities of spacetime interact with the gravity and 'em' modalities of matter.
13. I also believe that the 'gem' photon is the energy unit in this universe that makes up everything else in this universe, including eternally existent space and time. ('Space' being eternally existent energy itself, the eternally existent 'gem' photon, 'Time' being the eternally existent flow of energy, 'Space Time' being eternally existent energy and it's eternally existent flow).
14. When these vibrating 'gem' photons interact with other vibrating 'gem' photons, they tangle together and can interlock at times. Various shapes (strings, spheres, whatever) might be formed, which then create sub-atomic material, atoms, molecules, and everything in existence in this universe.
15. When the energy units unite and interlock together they would tend to stabilize and vibrate.
16. I believe there is probably a Photonic Theory Of The Atomic Structure.
17. Everything is basically "light" (photons) in a universe entirely filled with "light" (photons).
THE MAGNETIC FORCE SPECIFICALLY:
18. When the electron with it's associated magnetic field goes around the proton with it's associated magnetic field, internal and external energy oscillations are set up.
19. When more than one atom is involved, and these energy frequencies align, they add together, specifically the magnetic field frequency.
20. I currently believe that this is where a line of flux originates from, aligned magnetic field frequencies.
NOTES:
21. The Earth can be looked at as being a massive singular interacting photon with it's magnetic field, electrical surface field, and gravity, all three photonic forces all being 90 degrees from each other.
22. The flat spiral galaxy can be looked at as being a massive singular interacting photon with it's magnetic fields on each side of the plane of matter, the electrical field along the plane of matter, and gravity being directed towards the galactic center's black hole where the gravitational forces would meet, all three photonic forces all being 90 degrees from each other.
23. As below in the singularity, as above in the galaxy and probably universe as well.
24. I believe there are only two forces of nature, Gravity and EM, (GEM). Due to the stability of the GEM this is also why the forces of nature haven't evolved by now.
25. 'God' does not actually exist except for as a concept alone. The singular big bang theory is a fairy tale for various reasons. The CMBR from the supposed 'bang' should be long gone by now and should not even be able to be seen by us. Red Shift observations have a more 'normal' already known physics explanation, no dark energy nor dark matter needed. The universe always existed in some form and never had a beginning and will most probably never have an end. Galaxies collapse in upon themselves, 'bang', eventually generating new galaxies. Galaxies and 'life' just come and go in this eternally existent existence.
DISCLAIMER:
26. As I as well as all of humanity truly do not know what we do not know, the above certainly could be wrong. It would have to be proved or disproved to know for more certainty. Currently, my gravity test has to be accomplished to prove or disprove that portion of the TOE idea. But, if not this way, then what exactly is the TOE of this existence?
@@charlesbrightman4237 “The Final Theory: Rethinking Our Scientific Legacy”, Mark McCutcheon, for proper physics.
Damn you quantum mechanics! You ruined quantum computing!
Maybe one of the biggest advantages of quantum computing is simply that it drives us to figure out new ways to optimize classical algorithms to do things we always thought we needed quantum computers for.
Truly.
Only in theory. The practical algorithms people use in real life aren't affected by these efficiency results.
@@benjaminblack91 At a personal level, sure. But the world runs digitally thanks to scale over those algorithms
@@benjaminblack91 Fast quantum chemistry calculations would have practical applications in people's daily lives if successful for, say, drug development. Obviously it's a big 'if' though.
@@benjaminblack91 yes they are. just because you don't do molecular dynamics in your daily life, doesn't mean others don't
The reality distortion field is not accounted for in the Standard Model of physics, but it permeates almost everywhere in Silicon Valley. The RDF attracts money, and a lot of it, and somehow makes it disappear from normal 4D space.
Mind bending stuff, I think you could publish this and get funding for an…..”experiment”
Ah yes, the fundamental force - money
Money as a concept or money as cohesion?
this comment reads like Hitchhiker's Guide to the Galaxy
@@RyanLynch1 Yes, but only in 42 dimensions.
Does this mean that Quantum Computing is done and Not done?
...Yes and no.
Simultaneously... But not at once.
Probably still in Hadamard state
not a chance
Yes.
If it doesn't run Solitaire, it's dead on arrival.
It has minesweeper though, which is obviously superior!
But can it run doom?
@@nhart4325 Glitchy as hell, like God intended 😂
Rumors say it can run Crysis, which goes to show they're a bunch of liars.
I know this is a joke but quantum computers would actually be horrible for displaying graphics and running games
You need to understand that IBM employs "breakthrough announcements" as a marketing strategy. It enforces the belief that they are using advanced technology in their products. It's the same reason that Ford, Porsche, and Ferrari enter Grand Prix events.
This is the tech industry way.
Well, currently of those three only Ferrari take part in F1.
If all the QC 'breakthroughs' turn out like this it will just make them look foolish but they will still have their investors cash.
Google doesn't receive the same criticism for claiming quantum supremacy in 2019 for a task that actually takes 5 minutes on a classical computer?
Google gets hate for plenty of other things…like making a straight up racist AI
I have never thought that the achievements of a racing team somehow translate to the technology used in mass-production cars. It only has merit in the production-car classes, where modifications are minimal. People who are even a bit familiar with tech know that the gap between experimental tech and mass-produced tech is very big.
Last year, I read an IEEE article by IBM that said they had created a method of dealing with the inaccuracies of quantum computers. The problem was that it required a tremendous amount of supercomputer time to perform their algorithm. And when it came down to it, it was quicker and cheaper to run simulations on a conventional computer than on a quantum computer that had its data processed by their error correction algorithm. This was hailed as a great advance in quantum computing.
I wonder if there will be an ESTIMATION technique discovered which can be used as a cheap, high-speed way to bypass the supercomputing.
It won’t be perfect, but “good enough” to enable progress fwd.
Well:
1) Publication is a must
2) A publication always finds a positive result
3) Any knowledge gained is positive, so if we have proven for a subset of conditions that a procedure is completely impractical, that knowledge must be spread via publication.
It can be argued whether publication is the best way to spread a fool's errors, but rule number 1 says 'publication is a must'.
@@Tuning3434 Publish or perish
It may not be practical but it might provide inspiration for creating more practical techniques
@@imeakdo7 Sure. But people keep going on about quantum computing like it is useful now. It isn't. In a way, it's like graphene. We've all been told that graphene is going to revolutionize the world. But decades after its discovery, they've managed to find a few practical applications for the stuff, but the graphene revolution hasn't happened.
I'm not suggesting that people give up on quantum computing any more than I'm suggesting that they give up on graphene. I'm just suggesting that they allow realism to tinge their optimism.
I really hate it when people see or hear news about something which will "revolutionize science", like The Line in Saudi Arabia or the Hyperloop, and say "it will revolutionize urban planning" or "it will revolutionize transport" by repeating the same arguments made in the news article without knowing a single thing about what actually happens there. The same goes for quantum computers.
Here, I'm going to explain something you should know but you don't.
Every bit, and I mean EVERY bit, of "popular press" or "reliable sources" is either propaganda for a product, propaganda for a social movement, or propaganda for a government. It's obvious it is.
The dumbest humans that exist are reporters for the Socialists... "Don't ever believe your own lying eyes" they all scream loudly.
@@fuzzywzhe I'd say in many cases media isn't propaganda for a product, but the product itself. Commercial media makes up stories that sell to their audience.
@@fuzzywzhe And it really depends on what you count as reliable sources. A good bit of the information provided is actually verifiable, but few people put in the effort to verify it, and even true information can be used to support false claims.
@@roger5059 It's a waste of time filtering through the garbage of our "news" to find facts. You CAN find them, but they are always incomplete at best.
We created the Internet not to complement television and radio and newspapers, but to eliminate them.
Media is nothing but a commercial from everything to soap to social attitudes to war. That's why it has to be centrally controlled, to keep "on message".
"10,000 years in Silicon Valley are worth 5 minutes in China" ROTFLOL
动态网自由门 天安門 天安门 法輪功 李洪志 Free Tibet 六四天安門事件 The Tiananmen Square protests of 1989 天安門大屠殺 The Tiananmen Square Massacre 反右派鬥爭 The Anti-Rightist Struggle 大躍進政策 The Great Leap Forward 文化大革命 The Great Proletarian Cultural Revolution 人權 Human Rights 民運 Democratization 自由 Freedom 獨立 Independence 多黨制 Multi-party system 台灣 臺灣 Taiwan Formosa 中華民國 Republic of China 西藏 土伯特 唐古特 Tibet 達賴喇嘛 Dalai Lama 法輪功 Falun Dafa 新疆維吾爾自治區 The Xinjiang Uyghur Autonomous Region 諾貝爾和平獎 Nobel Peace Prize 劉暁波 Liu Xiaobo 民主 言論 思想 反共 反革命 抗議 運動 騷亂 暴亂 騷擾 擾亂 抗暴 平反 維權 示威游行 李洪志 法輪大法 大法弟子 強制斷種 強制堕胎 民族淨化 人體實驗 肅清 胡耀邦 趙紫陽 魏京生 王丹 還政於民 和平演變 激流中國 北京之春 大紀元時報 九評論共産黨 獨裁 專制 壓制 統一 監視 鎮壓 迫害 侵略 掠奪 破壞 拷問 屠殺 活摘器官 誘拐 買賣人口 遊進 走私 毒品 賣淫 春畫 賭博 六合彩 天安門 天安门 法輪功 李洪志 Winnie the Pooh 劉曉波动态网自由门
It may be the case we're trying to build a 22nd century computer in the 21st century. Apparently, people imagined 20th century computers hundreds of years ago but simply did not have the technology to build one.
Kudos. A reasonable answer.
Like the Analytical Engine, which is a fully-fledged programmable (punched cards) Turing-complete general-purpose computer with real programs (many written by Ada Lovelace). Its inventor, Charles Babbage, described it in 1837, more than 100 years before Zuse, but only built parts of it. There are interesting discussions available from this time, e.g. CISC vs. RISC or widening arithmetics.
And fortunately, he did not spend a lot of money on that
@@dmitripogosian5084 why fortunately? Because it would have worked, but the information age would have been bad for mankind? Or because it would have been too difficult? They showed a few years back that the tolerances of that time would have been good enough to build it.
@@sebastianwittmeier1274 You mean the original comment's claim that the technology of a hundred years ago wasn't there to build computers is wrong?
Just needs more RGB lighting.
Make sure to set it to red. The red ones go faster.
@@cameron7374 you're what's wrong with today's quack science... blue ones obviously go faster!
They added colors!? 🤩
@@cameron7374 They appear red due to the red-shifting
Blue is better for cooling. Saves a fortune in Liquid Nitrogen.
Your videos remind me of watching Scientific American and NOVA as a kid, there was always this magical feeling to seeing what's new in science and how it might change the world. Now I get to enjoy this same feeling as an adult and I'm incredibly grateful for that.
10,000 years or 5 minutes... What's a couple of orders of magnitude between friends? 😂
I'm assuming they meant 10,000 years on a similarly sized processor (128 bits), which of course isn't relevant since no modern computer processor is that incredibly tiny.
When no one understands anything, any one can say anything.
This paper was published in 2021, this video was published in 2024. So this isn't really breaking news, right?
"Couple" like in "9"?
@@stoferb876Google on Oct 23 2019 "Our machine performed the target computation in 200 seconds, and from measurements in our experiment we determined that it would take the world’s fastest supercomputer 10,000 years to produce a similar output."
"Maybe it just means that 10,000 years in Silicon Valley are worth 5 minutes in China". Sick burn.
I've been carrying around a few Hadamard gates in my wallet, you know, just in case, but now it looks like I won't need them for a few decades.
Spooky action at a distance is just entropy over time and distance
Actually, whatever she's saying applies to shallow circuits. We all knew this. Nothing new
@@ricardoorellana3350 I think that idea needs to be fleshed out a bit. What theory are you referring to?
The spookiest thing is that, at some level, either everything depends upon circular logic... or it just "works", somehow?
It's weird to think about how logic exists for no reason, and nothing existed before reality... not even emptiness itself! :D
@@ricardoorellana3350
🤣🤣🤣🤣🤣
The Ising model is a model of a model. It is the foundation stone of the development of statistical physics and inspires many models of physical phenomena. In a sense, it is the most successful testbed in science. So, Sabine, the Ising model deserves more credit! :)
At MIT I learned “All models are wrong. Some are better than others.” So a model of a model makes me MUCH less confident of the (model of a) model.
Sounds a bit like: Today's scientists have substituted mathematics for experiments, and they wander off through equation after equation and eventually build a structure which has no relation to reality.
-You know who.
@@baomao7243 You'd be wrong; statistical physics is one of the most successful parts of physics, and the Ising model (like much of stat phys) can be applied to many different problems with great explanatory power
@@baomao7243 I think it's a dimensionally reduced covariance or correspondence of a model; with the right choice of symmetry, it's great. Like 2D topology explaining a 3D observation, or capturing a smooth manifold or a set of quantized vortices as spin numbers. It's not modelling every detail, just the characteristics.
@@DamianHallbauer The key is just to know the assumptions and fully understand where it breaks.
Conclusion: even in useless applications, quantum computers are slower 😆
Shallow circuit yes😅
For deep circuits it's a different game. Efficient classical simulation there would imply P = BQP.
There seems to be only one task where they would be better: prime factorisation. Without that application there would be less hype around quantum computers, imho.
@@charlesboyer61 Damn. You spent half of your worthless comment trying to establish credibility
@@aaabbb-py5xdBro thought this was linkedin
@@charlesboyer61 Good comment. Don't listen to these haters
The outrageous claims made by some of the so-called quantum computing players speaks volumes about science in general. It's less about discovery and passion for breaking new frontiers and more about pulling the wool over the eyes of investors. I'm sure the "ours is bigger than yours" competitions will continue though, like it is with AI and advances.
Once upon a time it was to discover new knowledge for the sake of new knowledge, and if it ended up useful then so be it. But the feudal patrons of old, who funded science out of a desire to seem generous to the arts, were replaced by the money makers who instead leveraged funding into projects and fields that would make them money. The explosion of engineers, applied scientists who don't care about new knowledge for knowledge's sake, have been required to keep the science pimping machine humming. Woe be upon you if you study something "useless" as a scientist, as you will have to beg and scrape for even small amounts of money.
Quantum computers do have ridiculous theoretical advantages, however.
This has already been known for a long time: 70% of all science papers are garbage, and less than 1% are actually cited
Love the name man
The future of quantum computing is beginning to sound like that of nuclear fusion. With so many little roadblocks as well as major technological challenges standing in the way, I can't help but feel we will start to hear, _"It'll be ready in five more years, and then five more years after that, and then in five more years we'll just need five more years until all the kinks are worked out..."_
the only limits to our ability is our imaginations.
The only difference is that we know fusion will eventually work. There is no QC equivalent of the H-bomb.
It's the same hype train as always, only clothes change. You can detect what it is by looking at what it burns. ($€£¥...)
Focus fusion is staring us right in the face and is much more scalable.
It’s weird how it’s not getting more funding
@@naamadossantossilva4736 And we already get almost all our energy from fusion, just not fusion that we can control.
Usage of the transverse Ising model? The classical counterpart is NP-complete and lays the foundation not only for all these optimization problems (finance, traffic optimization, etc.) but can be used for protein folding as well (a toy encoding is sketched below).
For the error stuff - did you look at STA methods? Not perfect but quite interesting.
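To make the optimization-to-Ising mapping mentioned above concrete, here is a minimal Python sketch (not from the video or any paper; the 5-node graph, couplings and names are invented purely for illustration) showing how a tiny MaxCut instance becomes a classical Ising ground-state search:

# Minimal sketch: encode MaxCut as a classical Ising ground-state search.
# Toy example only; the 5-node graph below is made up for illustration.
from itertools import product

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]  # hypothetical graph
n = 5

def ising_energy(spins):
    # H = sum over edges of s_i * s_j; minimizing H maximizes the number
    # of edges whose endpoints carry opposite spins, i.e. the cut size.
    return sum(spins[i] * spins[j] for i, j in edges)

best = min(product([-1, +1], repeat=n), key=ising_energy)
cut_size = sum(1 for i, j in edges if best[i] != best[j])
print("spin assignment:", best, "edges cut:", cut_size)

Real instances replace the brute-force search with annealing or heuristics, which is exactly where the hardness shows up.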
I thought the major use for quantum computers was hacking passwords.
That's always how it was sold to us as kids.
"A quantum computer could hack every password instantly" if I recall...
@@ucantSQ Not passwords. Ciphers. Except for Russia's favorite, the one-time pad, because that's impossible to crack, but shh, quiet with that inconvenient detail :
It's called Shor's algorithm
@@jordanzo7465 with the classical version called the Birthday Pet algorithm. The classical version here again being slightly faster.
except that quantum-resistant cryptography algorithms for classical computers already exist, and will be widespread long before quantum computers are
Sorry, but the Ising model is not a purely academic curiosity. My professor called it the "Drosophila" or "harmonic oscillator" of statistical physics and I agree, since it's exactly solvable in some cases (it serves as a toy model/case study for new methods) and has plenty of applications, because different systems can be mapped onto it (e.g. deep neural networks, magnetism, polymers, or gases/liquids). This particular quantum version can be mapped to a classical version, which is where we find the many applications.
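For readers who haven't met it, this is roughly all it takes to play with the classical workhorse version; a minimal Metropolis Monte Carlo sketch in Python (lattice size, temperature and step count are arbitrary illustration values, not tied to any paper):

# Minimal Metropolis Monte Carlo for the 2D classical Ising model (J = kB = 1).
import numpy as np

rng = np.random.default_rng(0)
L, T, steps = 16, 2.0, 50_000                     # lattice size, temperature, MC steps
spins = rng.choice([-1, 1], size=(L, L))

for _ in range(steps):
    i, j = rng.integers(L, size=2)
    # Energy change from flipping spin (i, j): dE = 2 * s_ij * sum(neighbours)
    nb = spins[(i+1) % L, j] + spins[(i-1) % L, j] + spins[i, (j+1) % L] + spins[i, (j-1) % L]
    dE = 2 * spins[i, j] * nb
    if dE <= 0 or rng.random() < np.exp(-dE / T):
        spins[i, j] *= -1

print("magnetization per spin:", spins.mean())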
I remember you saying a year ago that the quantum hype bubble will soon burst. Looks like we'll be seeing it now.
just one year ago? i saw that 5 to 6 years ago
That was always true.
Those precious, naive, particle-pachinko-players have quantum-bipolar-disorder... :(
Sabine flip flops on this, in a later video she flipped and was more optimistic now she's flopped again. Not quite the Schroedinger effect but we're getting there.
"Has the quantum bubble burst yet, or not?"
"Yes."
Yes and no @@gcewing
Sabine, what about Andrew Childs et al.'s paper "Exponential algorithmic speedup by a quantum walk"? Why does no one ever talk about this paper? It clearly demonstrates quantum supremacy in a strong sense (they prove that no classical algorithm could do better). It does unfortunately fall into the category of "completely useless algorithms that quantum computers are provably better at", but it is nonetheless a "supreme" quantum algorithm, so I thought it would've gotten more attention (I put it right below Shor's algorithm, since Shor's algorithm is slightly more useful, but it's not going to make my internet faster or anything like that).
Very interesting video. I'm only a CS BSc, but do we really, practically, need a quantum computer to be able to tell whether or not there will be a quantum time/space speedup in general for algorithmic problems in a certain complexity class? The theoretical science says enough, right? As far as we know at least, because we still cannot prove whether P = NP or P != NP. But to proceed with this, we need money, and for money this needs to be marketed as bringing some kind of value/revolution, while we really have no clue if this is going to be the case. But I might have missed some newly found studies in the last 10-15 years.
Quantum computers is right up there with AI and cryptocurrency in being hyped technologies used to keep the investments flowing into R&D so tech companies can keep building those advanced and expensive computers. The demand driven by smartphones and streaming services has plateaued.
Absolutely hilarious! The gif vs jif snippet. Love your sense of humor!
- Get out!
- "Jet out"!
You know what, conventional computers are built with quantum-mechanical technologies, and that's why they're this powerful. Digital signals at very high transistor density are similar to quantum mechanics, with 2 superposed inputs for each qubit. Of course, conventional computers try to prevent quantum tunneling, while quantum computers try to use it.
If we were concerned about the calculation speed per bit/qubit, qubits would be 100,000x faster (if the input response can keep up to populate them all the time). But increasing the number of transistors by that amount, by adding chip communication for parallel processing, costs 100x less than adding 1 more qubit, and it also costs less energy overall in the same active CPU/QPU time.
So... what's the hope for quantum computing? Just like GPUs evolved from video cards, an IPU (instantaneous processing unit) could potentially emerge from high-temperature quantum computers with a compact super-cooler. For our general-purpose uses, only 6 qubits would be enough to accelerate lots of things.
This story sounds similar to that of D-Wave in 2013 (which I was somewhat involved in on the classical side). News reports at the time claimed that D-Wave's 439 (later 512) qubit adiabatic quantum computer was 10^8 times faster than classical algorithms for some task and 3600 times faster for another. The only trouble was they were comparing with terrible classical algorithms. It turned out that using a good classical heuristic, similar to belief propagation, that exploits the "thinness" of the underlying graph easily outperformed D-Wave's results.
I haven't understood the Tindall et al paper properly, but it looks on a surface reading like something similar might be happening here with the IBM Eagle, in that it is based on a heavy-hexagon lattice which, like the D-Wave graph of 2013, is only sparsely connected and has treelike correlations. The paper mentions that their method of belief propagation works well in such cases, which sounds similar to what I and others were finding in 2013 (a minimal illustration of why such sparse structure is classically easy follows below).
(If anyone is interested in revisiting the D-Wave story, see Scott Aaronson's blogs numbered 1400, 2555 and 3192.)
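For anyone wondering why chain- or tree-like structure is such a gift to classical methods, the simplest illustration is the 1D Ising chain, whose partition function a 2x2 transfer matrix gives exactly in linear time. A small Python sketch (parameters invented for illustration, unrelated to the Eagle experiment), with a brute-force check for a short chain:

# Exact partition function of a periodic 1D Ising chain via the transfer matrix.
# Chain-like (and more generally tree-like) structure is what makes such models
# classically easy; N, J and beta below are arbitrary illustration values.
import numpy as np
from itertools import product

N, J, beta = 10, 1.0, 0.7
T = np.array([[np.exp(beta * J), np.exp(-beta * J)],
              [np.exp(-beta * J), np.exp(beta * J)]])   # transfer matrix

Z_transfer = np.trace(np.linalg.matrix_power(T, N))      # linear in N

Z_brute = sum(
    np.exp(beta * J * sum(s[i] * s[(i + 1) % N] for i in range(N)))
    for s in product([-1, 1], repeat=N)                   # exponential in N
)
print(Z_transfer, Z_brute)   # the two agree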
There were plenty of papers pretty quickly pointing out that D-Wave wasn't really doing quantum calculations.
As someone who works with one of the best quantum information scientists in my country, let me tell you something I said and he agreed with:
"Quantum computing can either swim or sink, but it will do so in this decade." Quantum communication will probably still be a thing, because it is simpler to implement and it offers real advantages for security and cryptography, and we all know how much armies love those 2 things. But whether a computer to run quantum algorithms will ever come to be is still totally up for debate.
Quantum cryptography was talked about decades ago. I'm sceptical it will be cheap and physically resistant enough to see real-world use.
@@btf_flotsam478 I mean, I never read the article, but I know that China transmitted quantum information by satellite to Vienna.
And even my country of Brazil was able to do experiments in the Rio de Janeiro bay area, transmitting quantum information across the bay. Again, I never read the articles, so I don't know what the success rate was.
Although, just to be clear, my teacher only agreed with the part about sink or swim in this decade; the part about quantum cryptography was all my own thought.
@@nicolasreinaldet732 I am just stating my observations. I remember reading about quantum cryptography yonks ago and suspect the reason I haven't heard more of it is because of those technical hitches that are why these things completely vanish.
The things that get hyped up as world-changing always disappoint, while the actual world-changing technology is incredibly boring. The gauge block set is one of the most important inventions ever made in all of history, but most people don't even know they exist, and the inventor C.E. Johansson almost went bankrupt trying to sell them.
I agree for the most part. But AI will take away most of our jobs. It's beyond powerful. It has an IQ on Einstein's level and can replace even lawyers (also us physicists and mathematicians).
@@howmathematicianscreatemat9226 "AI" is as smart as the data it was used to train it.
The greatest revolution in the entire history of science is very likely just the modern language of algebra.
It went nearly unnoticed, and it happened very, very slowly, starting with Arabic numerals, getting enriched with operators, new notations, and so on... keep in mind there was a time when "the squared distance between two points equals the sum of squares of the difference in each coordinate between the points" was the entire Pythagorean theorem; it was the only way to write down its formula.
@@howmathematicianscreatemat9226 it has no iq given it doesn't even understand anything it vomits out
1) It's kind of obvious that a highly optimized conventional algorithm has every chance of being more efficient than a crude version of a quantum algorithm that nobody knows how to optimize.
Quantum algorithmics is confusing and only just getting started.
2) Even if we assume we have a problem that we know for a fact to be in a complexity class "better" than a classical complexity class, a small instance of the problem can still be harder in the former class (just think of linear time with a very high constant vs quadratic time: the complexity class hierarchy is only valid asymptotically; a small numeric sketch follows this comment).
Today quantum computing only "works" for small instances of problems (because of the lack of stable qubits).
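A quick numeric sketch of point 2 (the cost constants below are invented purely for illustration and stand in for "better complexity class, worse overhead"):

# Toy illustration: an algorithm from the 'better' asymptotic class can still
# lose on every instance size you can actually run. Constants are invented.
def fast_class_cost(n):      # linear, but with a huge constant factor
    return 1_000_000 * n

def slow_class_cost(n):      # quadratic, tiny constant factor
    return n * n

for n in (10, 1_000, 100_000, 1_000_000, 10_000_000):
    winner = "linear" if fast_class_cost(n) < slow_class_cost(n) else "quadratic"
    print(f"n={n:>10}: cheaper algorithm is {winner}")
# The 'better' linear algorithm only wins once n exceeds 1,000,000.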
Sounds like quantum computers have the same issue as fusion reactors: "We will have it figured out in 10 more years if you keep funding our research".
Can't expect everything worthwhile to happen in one lifetime
Except fusion is actually useful
Well, maybe it will be practical on the Moon.. helium-3, low pressure is cheap, cooling is cheap, and leaks or explosions can maybe just be cast into deep space.. the net energy gain is related to cooling costs and vacuums and such. Quantum computers can't work on the Moon? Superconducting ones might be easy, and cooling isn't as hard if there is a big conductor and the heat can be reused, not just blown away by a fan.
Fusion research has been at it long enough now to actually demonstrate incremental improvement. But yes, both suffer from the same fundamental problem of human impatience. The problem of a society and economic system that want a 3-10 year return on investment in spite of being made up of a species that took THREE HUNDRED MILLENNIA to discover writing. It's a sad reality of our species that we're only motivated by short-term goals; admitting "this will take 100 years of continuous R&D" would basically mean humanity would shelve it and probably follow the dinosaurs into death-by-spacerock waiting for the "right time" to start working on it.
Except we know that fusion reactors are at least possible to create, because nature has lots of examples. We still don't know if a real, practical quantum computer can ever be built. A few qubits for a few microseconds is a cool toy, not a practical machine.
Modern computers wouldn't exist without understanding gained from quantum physics. Are they not also quantum computers?
Well, proving you could perform that specific task faster with a system we have much more experience with, after the initial test, isn't surprising.
Bit of an unfair comparison. Even if quantum computing were only useful for niche applications, we can learn a lot by experimenting with it.
Not surprised.
Every time, for I'm not even sure how many years now, when someone claims they've reached the stars in regards to quantum computing and I actually bother to read up on the claim fully, I find it to be extreme levels of hype and completely unrealistic.
I've been pessimistic about the idea ever since the first time I read up on it, simply because there's just no good way to make it work in any CONVENIENT way.
It's possible to achieve various successes, sure, but they're in laboratory conditions and simply cannot be transferred over to a functional workplace environment. Not even as mainframes of the old "house-sized computers" era.
And far too often, they essentially do not even work properly.
So, seems like i get to start throwing around "i told you so" to various people.
Again.
The perfect woman doesn't exi...
Quantum computing is one of those very hard technical goals. Is it something that should be pursued? I believe so. Is it something that we will realize in the near term? Probably not.
The reason I think this is important research is that understanding how things work at a quantum level is actually quite important to our societal future. As we continue to scale sizes down, we become more dependent on handling quantum issues. We might not ever get to a quantum computer that is worth the cost, though I believe the major issues with cost will be overcome, but what we learn along the way is priceless.
3:41 dramatic black screen 👁️👄👁️
editor blacked out on this mouthful 😂😂😂
I think that it is not emphasized enough how much, in a way, the interest in quantum computing goes way beyond solving logistics problems, factorizing big numbers and inverting matrices. Quantum computing is the ability to control and measure systems at the most fundamental level known. It's a (sophisticated) version of learning how to move rocks, manipulate matter, synthesize new stuff with chemical reactions and so on; there is just no way it will not be useful. We have already achieved a level of control at the quantum level unthinkable in the 90's on many physical platforms: neutral atoms, ions, superconducting qubits, even nuclear resonance. Optical tweezers, laser cooling and other techniques were at least partially developed for these purposes, and they have all kinds of applications in low-energy physics. We should also not forget that without the ability to produce entangled states we would still not know whether hidden variables were a thing, and without squeezed states no gravitational waves can be detected. Not to mention that we can only find out whether decoherence is intrinsic to physical interaction (whether because of gravity or not) by trying to build larger and larger coherent objects, which is kind of what quantum computing is.
It hasn't got anything to do with nuclear fusion, even though the continuous (very annoying) claims by IBM and Google that within 10 years they will have a zillion-qubit smartphone make it sound the same way.
6:00 omg Sabine is on fire 🔥🔥🔥💅😂
She's simply great
I found it worthwhile learning how to program quantum computers. It helped me a) understand quantum mechanics and b) demonstrate to myself that they are waaay off being useful just yet
Same here.
Toying with a quantum computer sim is fun
@@affegpus4195 well you can also run code on a real quantum computer. It's interesting to see how noisy they are compared to a sim.
YouTube, thanks for thinking I can understand any of this. It's a nice compliment and all, but no, I'm not that smart lol
Quantum computing is ultimately about a faster ALU, but most of a compute task's time is spent elsewhere, in staging the relevant data, not in the actual arithmetic manipulation in the ALU. Even if the time delay through the von Neumann bottleneck at the ALU were reduced to zero, almost all of the total runtime required for typical computation-intensive tasks would remain. Centrally triggered, distributed computing is needed.
Yeah, and if he returned, he'd be appalled to see we're still architected as if high-bandwidth networks and essentially free compute units didn't exist. @@retiredbore378
Yes. This is why you should always trust it.
Case in point... the first 15 seconds of this video explain to you why you SHOULD question everything. Everything is science. Everything is "right". Until... it suddenly isn't.
What did Einstein mean then with "spooky action at a distance"?
I always thought he meant entanglement, and Google and ChatGPT also say so, so I can't find the real meaning.
Sabine please answer. From en.wikipedia.org/wiki/Einstein%E2%80%93Podolsky%E2%80%93Rosen_paradox "The thought experiment involves a pair of particles prepared in what would later become known as an entangled state. "
On the subject of Einstein, it is time to give up the springy Einstein prop on your otherwise excellent science videos. Einstein was the greatest and most transcendent of many great physicists. He doesn't need to be parodied by a springy prop, and I don't think any of us should try to second-guess what Einstein would say or think almost 70 years after his passing. The prop might get you more YouTube views, so perhaps you are doomed to continue this gimmick.
I don't think people realize just how important speeding up materials science is.
The number of undiscovered potential chemicals, alloys, and other materials is frankly staggering. The amount of stuff you could do if you had the right material is crazy.
Trotterization is not a specification of the model; it is a specific way of solving the problem. In fact, it's quite general: one can use it for any given Hamiltonian to do the quantum phase estimation experiment.
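For readers unfamiliar with the term, here is a minimal Python sketch of first-order Trotterization for two non-commuting 2x2 Hamiltonians (Pauli X and Z; the evolution time and step counts are arbitrary illustration values):

# First-order Trotterization for H = X + Z: approximate exp(-iHt) by
# repeated short steps exp(-iXt/n) exp(-iZt/n). Illustration only.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
t = 1.0

exact = expm(-1j * (X + Z) * t)
for n in (1, 10, 100):
    step = expm(-1j * X * t / n) @ expm(-1j * Z * t / n)
    trotter = np.linalg.matrix_power(step, n)
    print(n, np.linalg.norm(trotter - exact))   # error shrinks roughly as 1/n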
The last time I investigated quantum computing, I was told that you need a system next to the QC which is almost as massive as the calculation you are trying to do using the qubits.. meaning you don't actually need the qubits in the first place.
I was told "Quantum mechanics is probably one of the best understood branches of all of physics." this morning, by someone who couldn't tell me how long quantum links last.
This is an entertaining field of physics.
If I'm not mistaken, the first quantum computer was brought out by the Canadian quantum computing research company D-Wave barely more than a decade ago, yet the utility of quantum computing technology is brought into question because it doesn't already meet expectations by always outperforming classical computer systems? 🤔🤨
I find your approach to explaining excellent and the humour is excellent as well. I was delighted to see the audio only podcast in the description.
I suspect that quantum computers will turn-out to be even less useful because of Digital MemComputing (DMM) - which is a new type of computing system that may provide strong competition for certain problems (such as factorisation). At the moment they are flying under the radar, but they seem to be making great strides towards building practical systems, so they will surely gain attention sooner or later. Most recently they I read they managed to implement a DMM using an off-the-shelf FPGA, which could be a game-changer.
I only came across DMM recently, and am still absorbing how it works, but basically it's using memristors (or circuits carefully configured to behave like them, until memristors can be cheaply fabricated) to implement digital logic circuits that can (potentially!) solve sufficiently large problems thousands or even millions of times faster than traditional CPUs.
How does it manage such big speed-ups? Their high-level explanation is a bit too opaque for my liking, but from what I can tell all components work in parallel towards finding the solution (effectively testing many solutions in parallel), whereas a traditional CPU can only test one solution at a time. So the larger a DMM can be made, the bigger the speed-up it will get. E.g. they say they can solve NP-complete problems in polynomial time & resources, whereas traditional computers would take exponential time.
Another way of looking at it is they run digital logical circuits both forwards (as is traditional) & backwards at the same time (which is initially mind bending), until it finds a solution that works for all logic circuit elements at once - which feels a bit like how quantum computers also work.
What are the differences between parallel analog computing and quantum computing?
Classical IBM marketing, see past years when IBM has claimed early leadership in Cloud Computing, et al. But that's ok, MSFT, GOOG, ... all have "eager Marketing" efforts. QC is perhaps going to be a new paradigm of computing, next to analog and digital computing. Lots of challenges to go but a 'killer app' for QC will be breaking cybersecurity as we know it today.
@@phoenixrising164 No it's not, the term 'Cloud' came when Ray Ozzie @ MSFT proposed it as a new strategy and some folks (incl. @ IBM) all went "..what! That's time-sharing again!" I was @ MSFT then and I had a front-row seat when it all happened.
Doesn't a superposition of three or more states require exponentially fewer linkages (for the same computations)? "Qubits" would then be a misnomer.
I wish I knew what you are talking about, but I'm happy someone here is more involved in quantum computing than the average viewer, whose knowledge about quantum physics comes from short youtube videos, such as myself :). I keep my fingers crossed for all those who have the brains and resources to explore this field. I hope one day we will have reliable quantum machines. The reason I gave up on quantum is the expensive equipment that is necessary to experiment and get the "feel" for what it actually does. Intuition is useless for exploring it, and as a hobbyist, I was left with math and whatever others have discovered, which could be a dead end, and still there are many questions along the way no book can answer. My position is that this field is reserved for those who have dedicated their lives/careers to it.
They are so desperate for positive quantum computer news to sustain funding that they are basically busy trying to do just that. And even that is not succeeding. I think quantum computing will never be viable. The more qubits, the more noise. The more noise, the more qubits you need to compensate for the noise. It's exponential. For quantum computers to be useful, they have to consist of enough qubits. It might very well be the case that we will never be able to get a useful quantum computer, unless we somehow learn more about physics and find some way to reduce noise that we have not found previously. Which doesn't look likely at this point. I don't even think we will have economical fusion reactors soon, or at all. But I deem fusion reactors more viable than quantum computers lol
Entanglement doesn't exist in conventional computers?
I'll have to disagree with you on this one, Sabine.
Maybe you haven't seen my parents cable management strategy for their computer from the early 2000's.
There is literally no other word to describe that thing but entangled.
As a mathematician and software developer I've read a lot of QC papers. There are a lot of buzzwords the authors didn't seem to understand, and hype that they didn't seem to want to actually support, which is not exactly what I expect from scientific papers. After two decades of this I'm pretty confident quantum computing is the fusion power of the computing world.
This was said about machine learning from the 80s to 2012 as well.
Furthermore quantum computing is not computer engineering. It is foundational research at the moment.
Judging quantum computation based on its social merit is in itself proof of the usability of the field. Mostly it is still very foundational physics research. Just think about judging string theory or quantum gravity and comparing them to some semi-classical theories in regards to how many new materials have been created (semiconductors, polymers, etc.).
Most fields in physics and mathematics are not judged based on their relevance to society, and we have a field here which is. I find it even harder to take when the person who criticizes quantum computation worked her entire life in quantum gravity, which definitely has not produced anything that can solve a real-world problem^^.
@@DatDaDu That was a lot of buzz words with nearly zero meaning.
String theory is nonsense and everyone who knows anything about physics and mathematics knows that. The people still pushing it are cranks or grifters. That you are not aware of that makes you either a crank, grifter or ignorant of basic reality.
Quantum gravity is a fundamental problem that has to be solved. Gravity exists at the macro level. Why does it "disappear" at the quantum scale? Again your fundamental ignorance of the current state of real world physics and mathematics is astounding and disqualifying.
Quantum computers have advantages in certain applications, specifically those with exponential scaling. Also, you would have to question how much energy is used by a conventional computer vs a quantum computer for the same task.
Software developers are typically taught that the algorithms have more to do with program running times than processor speeds past a certain point.
For the first time someone helped me understand what a quantum computer is. The key words were "data storage". I realize that's obvious to some but now I get it......I think. It seems to me the idea behind quantum computing is to allow the computer to "multitask" but on a crazy level.
My handwaving big cheat rule of thumb interpretation of quantum mechanics says that for all objects heavier than the Planck mass, we can replace the Schroedinger equation by a classical system with the addition of ordinary classical Brownian motion. The Heisenberg Uncertainty Principle is replaced by the Fuerth Uncertainty Principle on the same scale.
Just suppose this were true, what would it mean for quantum computers? Rapid and excessive decoherence?
My interpretation fails on Bell’s inequalities, superfluidity and chain reactions by the way. We can still distribute computer simulations incorporating it with a random number generator, and ask everybody to think of something better. It is possible that we may go for a hybrid simulation with a mixture of this big cheat for the detectors and the something better for the microworld. My own idea of something better is tachyonic Brownian motion which begins by being orthogonal to everything else. A detector based on TBM might be at least as complicated as two molecules of nitrogen tri-iodide, which is the motivation for sticking with the big cheat.
If I'm alive in 50 years (doubtful; nuclear war, climate change, my spouse who is upset at me for leaving the toilet seat up), I want to come back to this video, look at how we went back and forth on the judgment of their practicality, and laugh
Coffee is bad 4 u. I mean good.
i hope ur not alive with that mentality!!!
If a photon behaves as a wave and is pure energy, how can it travel in a wave 🌊 without having additional force or losing some energy? Because the up-down movement must take some force, even if it doesn't have mass. Are the spatial/gravitational fields sandwiching the photons along their trajectories? Also, if you do the double-slit experiment 100 times, do all the particles always get detected in the exact same places, or are they randomly detected? Thanks
Any introductory electrodynamics or quantum mechanics textbook might help you there.
@@Tomyb15 could you recommend any ?
I love the Feynman reference, "No one understands quantum mechanics". My name is Bicycle Bob and I approved this message.
Great video, finally one not hyping Quantum Computing. We are so far from understanding quantum, we don't know if quantum computers are even possible. Is there anything useful that "quantum computers" we have have actually calculated so far, after decades since the first one was "created"?
Are there any crazy awesome applications of Quantum mechanics out there?
The question isn't what quantum computers can do in theory. It's about what we have actually used them for and how we utilize the generated outcomes. It doesn't matter if we have a functional QC. It only matters that its function is useful
None of the algorithms I've seen so far make use of entanglement. It's all just based on the phase of the superposition of the two states of each qubit, and the phase kickback between target and control qubits in controlled operations. The algorithms are either really contrived (is my oracle balanced?), or are actually hiding an unreasonable number of operations (quantum phase estimation, Shor factoring) in a way which doesn't make sense considering a quantum circuit is not actually a physical circuit.
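For what it's worth, the phase kickback mentioned above takes only a few lines of linear algebra to reproduce; a minimal sketch (illustration only, not tied to any particular algorithm):

# Phase kickback: a CNOT with control |+> and target |-> leaves the target
# unchanged but flips the control to |->, i.e. the phase is kicked back.
import numpy as np

plus = np.array([1, 1]) / np.sqrt(2)
minus = np.array([1, -1]) / np.sqrt(2)
state = np.kron(plus, minus)                 # control (first qubit) ⊗ target

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])               # control = first qubit

out = CNOT @ state
print(np.allclose(out, np.kron(minus, minus)))   # True: control ended up in |->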
Have we done sufficient testing to see what can cause wave collapse? I wonder what different things are able to cause the wave collapse. Can a bird, bug, electron, photon, sound, or vibration cause the wave collapse, or is it centered around people and detectors?
How's that world changing HYPER-LOOP coming along? "Hype" is literally built into the name.
Would a hybrid quantum processor and conventional computer system be useful?
You didn't mention *time complexity.* Yes, you did mention how regular computers have an advantage that makes them seem as if they're always faster. But it's important to note that QCs may have better asymptotic complexity than RCs/CCs. The only way to know (as of now) is via mathematical predictions, since we can't do those benchmarks
Great content, glad the algorithm lead me here, subscribed!
"No that's not what Einstein meant by spooky action at a distance" - what did he mean? I always thought he was talking about entanglement
Sabine, please enlighten us. A lot of us think of the EPR paradox as synonymous with entanglement and closely connected with Bell's work.
@@charlesmott1267 EPR is but is also not related to spooky action. He was referring to instantaneous measurement update; this can be confirmed by the basic timeline of when he used those words relative to EPR. Sabine does have a video on this.
LOVE that confused look of the emoji
Maybe a quantum computer can help to narrow down the possible range of the mass of an elusive particle, so that you can save time needed to find it with a fitting detection device? This is the only practical use I can imagine.
Haha, love it: "10 years in Silicon Valley is only worth 5 minutes in China" 😂 The crazy part is it's not far off; that's why we in Chinese tech companies beat MANY US companies, until the US "free and open market", the cornerstone of capitalism, needed retraction and now only holds if it's on US terms 🤦
Keep it up 🇨🇳👍
Another banger Sabine, continue the outstanding work!
I am just going to chime in from the “quantum chemistry” field. There are certain calculations that can be sped up dramatically by a quantum computer (for those interested: “configuration interaction”). Although it currently seems that the easiest way to do this is via a hybrid model, where the “quantum computer” acts more like a dedicated card (imagine a graphics card) and the bulk of the calculation is still done using a regular computer.
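That hybrid picture can be sketched in a few lines: a classical optimizer drives a parametrized "quantum" subroutine, which here is simply simulated with plain numpy. This is a toy single-qubit example with invented names, a sketch of the accelerator-card idea rather than an actual configuration-interaction workflow:

# Toy hybrid quantum/classical loop: a classical optimizer tunes the parameter
# of a 'quantum' subroutine (simulated classically here). Illustration only.
import numpy as np
from scipy.optimize import minimize

def energy(theta):
    # 'Quantum' part: prepare Ry(theta)|0> and return the expectation of Z.
    state = np.array([np.cos(theta[0] / 2), np.sin(theta[0] / 2)])
    return state @ np.diag([1.0, -1.0]) @ state

result = minimize(energy, x0=[0.1], method="Nelder-Mead")   # classical part
print(result.x, result.fun)   # theta approaches pi, energy approaches -1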
"Maybe ten thousand years in Silicone Valley are only worth five minutes in China". Savage.
Hello, I sincerely thank you for your contribution to the development and popularization of science. I would also like to express my deep gratitude to you for making a video with science news because in the modern world there are very few sources from which you can hear truly important research and scientific discoveries
I'd be very interested to hear your thoughts on Garrett Lisi's E8 theory and the subsequent research done by the Quantun Gravety Research institute.
Helpful insights. Thanks, Sabine. Best, Carlo
I wouldn't be too pessimistic still - think of very early photography, incl. lab time, vs painting. Those grainy early pics and the equipment cannot touch what a modern phone camera can do. So, we've seen dramatic improvement before.
When thinking of the true purpose or niche quantum computer eventually might occupy - the only one I really believe in is optimization tasks. Logistics in the first place. But once this works, more interesting fields will take the stage with their specific optimization problems. Think of optimizing a whole product including its design, manufacturing, logistics, and business model including a fair bit of uncertainty from the outside world. That’s big $.
While it’s hard/impossible to show that quantum computers will eventually master such tasks, it is easy to show that conventional computers will never (that NP thing).
How do you do a quantum calculation? Quantum mechanics is based on probability. You don't know for sure. How do you put together a series of steps when it's uses uncertainty on a fundamental level?
Quantum computers are deterministic up until you perform a measurement. However, sometimes you can manipulate the probabilities so that the probability of the measurement giving a particular outcome is 100%, making the algorithm fully deterministic.
As a simple example, take a quantum algorithm made up of 2 Hadamard gates in series: if you send in a basis state, the first Hadamard puts it into a superposition, but the second one undoes it perfectly, leaving you with the original state.
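That two-Hadamard example is easy to check numerically; a minimal sketch:

# Applying the Hadamard gate twice returns the input basis state.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
zero = np.array([1, 0])                   # basis state |0>

after_one = H @ zero                      # equal superposition of |0> and |1>
after_two = H @ after_one                 # back to |0>
print(after_one, np.allclose(after_two, zero))   # approx [0.707 0.707], True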
Don't regular processors have a lot of redundancy and error correction built in? As far as I know in the beginning processors had to be perfect with no defects or they were scrap. Now with 100 million plus, one or two transistors are likely to be faulty. Are the quantum computer developers taking the semiconductor lessons on board?
When people talk about quantum computers, they see them as the future, when they really need to be seen as computers nearly half a century behind that will spend another half decade catching up to a technology that is still advancing.
To be honest, part of my issue with many articles like this one is that, even if I agree with the science behind it, I feel mixed about their conclusion, because an issue that comes up when discussing practicality is that in reality it is often much more of a comparative lens than people make it. There are many things that have already proven their practicality to experts in their initial usage, but because that usage isn't as relatable to the public eye, they often need to find ways to expand their usage. This is often the case with breakthrough innovations. It isn't necessarily that they truly have no usage, but about the attempt to expand the usage to more generalized fields.
This is the kind of in-depth news reporting I enjoy. Thanks.
I love your attention to detail to follow up past "claims" to the present time where they are debunked.
A fuzzy question, possibly in the wrong words:
On the topic of the limited usability of quantum computers: is it because of conditional branching in calculations, I suppose?
Some of them can be simulated by extending the horizontal solution area further, but I suppose there are perhaps cases in which you can't map the time dimension into the quantum solution space (an implicit computational irreducibility, which Wolfram loves, haha).
On the other side, you debunked the latest back-in-time quantum experiments, but if that finally came true, don't you think it would be exploitable to solve the inherently sequential calculation of some computer problems?
Is it correct that the problem is that some computer algorithms are inherently sequential and because of that can't exploit quantum properties? If that is the case, could some apparently inherently sequential problems later be converted to a quantizable solution space?
Are back-in-time quantum effects totally ruled out, or was just one particular experiment debunked?
I know many are biased towards a more consistent predeterministic explanation of quantum mechanics, but any idea whether there are possibilities of being able to exploit quantum effects in any type of algorithm?
Can we use the WKB approximation to set estimates for how much misinformation can tunnel from desperate quantum computing project managers through into science media and when?
We can assign the depth of the energy well proportional to the commercial branding value unlocked by publication of the misinformation.
The when part applies to pre and post debunking. Who wants to republish provably wrong news ?
Meeting the authors in person, they stress that the paper was about utility, not about advantage. Although it is funny when papers are published where they use IBM's Qiskit and a laptop to outperform the quantum processor, you can see that the classical methods do not give accurate and coherent results when used outside their applicability range and differ significantly from the quantum simulation. It is a question for the next papers to show whether quantum computations outperform classical algorithms in terms of parameter range.
4:35 I've found in a couple of different areas of life that messy, unconventional methods of solving problems often lead to the development of faster, more efficient conventional methods of solving the problem. It was the act of the messy, unconventional solution that spurred the development of the orderly, conventional method. Could that be what we're seeing here? Quantum computing may not actually be better at doing computations, but it's better at devising solutions for conventional computing to do computations.
Every time I see an expert talking about quantum computers, it's the same conversation: "the bits of the quantum computer are not limited to 0 or 1, instead, they can take on any value and this makes the computer much faster..."
My question is the following: if this is so, then what is the difference between a quantum computer and an analog computer? Analog computers also operate with parameters that can assume any value, and they are even faster; they are instantaneous.
Maybe it would be a good idea, before digging deeper into quantum computing, to go back into the past and try to see how analog computers could be more efficient using the materials and means we have today.
Wow... Somehow your comment is so ignorant that I don't even know where to begin making fun of it.
Buddy, they are just trying to imitate the brain through quantum computing. Can't beat the biological computer yet, and can't draw hands properly 😂.
And again, buddy, what is an analog computer? Some abacus or spinning wheel with number gears or something, like the Bombe? Those are incredibly slow.
I wonder if you will ever do a video on the catastrophic effects of the way researchers are looking for funding...