Check out my new course on Technology and Investing in Silicon:
www.anastasiintech.com/course
The first 50 people to sign up get 25% off with the code “EARLY25”.
Love your content.
Do I need to study Electrical Engineering to do a Probabilistic Computer Start-up company?
@TheBlueMahoe No, man. Delegate! Have someone else handle their strong suit while you stick to yours. Use collaboration and delegation to build a real team and fill in the gaps in each other's competence.
If you're gonna harvest noise, what about random number generators??
I guess that sounds like synthetic data for stochastic parrots, kinda.... ;*[}
I thought years ago about using a comparator neural net built from arrays of D/A converters & comparators for near-instantaneous results. Basically a DLN has to perform a huge array of comparisons to determine a match, but this could all be done with an analog array, using a D/A converter to create a comparison node value that feeds into one side of each comparator. This would be extremely energy efficient as well as fast. Imagine a chip with 100M analog comparator nodes using a couple of watts of power.
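One plausible reading of that idea, as a rough software sketch in Python (the threshold scheme, sizes, and names are illustrative assumptions, not anything from the video): each comparator binarizes one analog input against a DAC-set reference, and a stored binary pattern is "matched" by bitwise agreement.

import numpy as np

def best_match(x, stored_bits, threshold=0.5):
    # One comparator per input node binarizes the analog level against a DAC-set threshold;
    # the "match" is the stored binary pattern with the highest bitwise agreement.
    fired = (x > threshold).astype(int)                 # comparator outputs, 0/1
    agreement = (stored_bits == fired).sum(axis=1)      # agreement count per stored pattern
    return int(np.argmax(agreement))

x = np.random.rand(16)                                  # analog input levels
stored_bits = np.random.randint(0, 2, size=(4, 16))     # four stored binary patterns
print(best_match(x, stored_bits))

In hardware the agreement count would presumably be summed in analog rather than in software, which is where the energy savings would come from.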
The idea of p-bits acting as a bridge between classical and quantum computing is mind-blowing! Could this be the practical ‘quantum’ tech we need before full quantum computers are ready? cool
Most likely. I've been waiting for this for ~8 years, iirc.
No. q-bits without quantum entanglement are the same as p-bits. But to be faster than classical computing, entanglement is required. We already had p-bits.
Very AI-bot comment. Genuinely asking: are you a human?
@@Nandarion Yes, so it seems that p-bits can solve the equilibria where the variance converges, and then q-bits can also solve the ones where it diverges.
Sounds like more vapor-ware marketing speak.
I solved this years ago when I hooked up my stereo to my computer, put on some Zappa, turned it all the way up, and then pulled the knob off...
Saw Frank and The Mothers back in 1971. What a show (Portland, Oregon).
You didn’t really address how p-bits and algorithms and data work together to produce an output or solution.
You have to buy the course. Can't give away all the secrets for free 🤪
@ I don’t need a course. Just a very high level description of the way you program with p-bits. Hardware advances are great. But the value of hardware is achieved through software. So it’s a great concept, but software will define its success.
In all fairness you’d probably need an entire video for that
There is a YT video from 6 years ago on the IEEE page. Just search "p-bits" and it should be the top link.
@@roch145 "very high level description" sounds like a course; but, yes, the software is required. But isn't mathematical literature already being implemented as software for these quantum computers? To my recollection, the "software" is what enabled the hardware to begin development in the first place (seeing as the 'soft' is the structured thought and the 'hard' is the physical body) - but it's definitely something that needs to be more publicly enticing...
32K fully ray traced minecraft coming 😌
But can it run Crysis?
no thanks
With atomic sized voxels 🥵
Oh hell yeah 😎
99 googolplex fps
Sounds very similar to adiabatic quantum computing. Useful for solving optimisation problems, but not universal computing.
Would you reckon it could be used for even more realistic ray tracing? Where these methods are used to cast orders of magnitude more rays in random directions, cast from the light sources and reacting with materials (maybe based on actual physics and photon interaction), with only a very tiny fraction reaching the camera. Just like in real life.
I hope I'm making sense here.
Yep. I foresee these analog systems working in tandem with classical von Neumann / Turing machines, all in the same box. So the CPU offloads a task to the analog chip, then takes the result back to classical land.
@@DeltaNovum It's hard to foresee exactly how things would come to be, but one thing is for sure, once we humans are able to abstract functionality behind a layer, we find all kinds of novel ways to use it. Just look at what we were able to do with data and arithmetic logic gates.
rebranded "quantum annealing"
Was not what I thought the video would be about, learned a TON ...loads of cool info, can't wait to see this deployed ! :)
The universe is probably deterministic. Our lack of ability to see all the variables means it looks random. Although we might be able to build machines to see more variables, to see them all we would probably need more material than the universe has.
Your first sentence said it all 😊
This was essentially Einstein's position in the face of indeterminate wavefunction collapse upon measurement, yes?
I know you are probably ;) joking, @@ip6289. But in case you aren't: there is a difference between using a word in an epistemological context versus in an ontological sense. His use of the word "probably" in the first sentence is in the former sense.
If Byte magazine was still around today, you would be doing the digital video version of it. That’s exactly what your discussions remind me of.
You should rename the channel to Byte Anastasia. LOL
I miss Byte magazine so much...
@@JVerstry I also miss Computer Shopper, in the sense that you could go through it once to find what you wanted and wouldn't have to doom-scroll all day and watch videos about stuff that misinforms you.
@@JVerstry I just bought a $2000 VR headset that I'm waiting on until February or March, and after the fact I found out that they've never shipped a product, even though they've announced two other products.
A computer's memory is limited to the number of transistors it can use when computing (constants, variables, coefficients). When it is computing, information is fed to it sequentially. The result of an algorithm occurs as combinational logic. An analogy would be that computing is like making a bag of microwave popcorn, where each kernel is data (constants, variables, coefficients). Assume that (in this analogy) the kernels pop randomly, but once they are popped they are no longer kernels; they are popcorn. So they are moved to a different part of memory (called the result). This frees up the initial memory so it may be used by the algorithm. The algorithm can then speed up and finish the job faster, because it can use more memory and therefore do parallel processing, as in combinational logic. This saves time and energy.
@@garyrust9055 I think she understands conventional chip architecture. Plus I was under the impression it's propositional, sequential, and combinational logic that's used in a low-level architecture.
@garyrust9055 who wrote that poem?
@@mhamadkamel6891 ChatGPT
Very interesting finds. Hope they manage to iron out the drawbacks before another tech sees daylight. I've read about analog computers from the early computing age, and if this technology arrives, things will have come full circle. Great video. Thanks!
Good work Anastasi.
great video as always, thank you for posting Anastasi!!!
Thanks Anastasi for the informative content as always!
Also, what is your bet that graphene processors could accelerate this further and shorten the time from the conceptual phase to the first hardware test setup? Cheers and keep up the amazing content!
Super interesting & super well presented. As I understand it, the solutions obtained by a Boltzmann or restricted Boltzmann machine are minima in the parameter space defined by the energy of each state & the total energy of the system. Boltzmann showed that the probability of a given state is proportional to the exponential of minus its energy divided by the temperature (with the temperature multiplied by Boltzmann's constant to get the true probability). It is a brilliantly simple model that works for physical systems & has been adopted by the two winners of this year's physics Nobel Prize to create AI systems that find solutions as minima in the model space using Boltzmann's equation. It is sad to recall that Boltzmann took his own life a little before his ideas became accepted. Thank you for sharing!
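For reference, the distribution being described, written out in standard notation (E(s) is the energy of state s, T the temperature, k_B Boltzmann's constant, Z the normalizing sum over all states):

p(s) = \frac{e^{-E(s)/(k_B T)}}{Z}, \qquad Z = \sum_{s'} e^{-E(s')/(k_B T)}

Low-energy states, the minima such a machine is searching for, are exponentially more probable, and lowering T sharpens the distribution around them.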
There is "Monte Carlo Integration", which is a method of computing by random sampling, that has been known for several decades. To approximate the area of a shape in a rectangle, you could try covering just the interior of the shape with tiny boxes and then the total area of all the boxes is the approximate area of the shape; this is the classical computation. Alternatively, randomly choose a thousand points in the rectangle and count how many are inside the shape. This gives a statistical estimate of the area of the shape. As the number of dimensions of the rectangle increases (i.e., shapes in N-dimensional boxes), the numerical error associated with the classical computation tends to grow more quickly than the numerical error associated with random sampling, I recall. The "probabilistic computing" discussed in this video reminded me of these "random sampling" methods.
Just had a crazy neural-network-related idea; I wonder if it has been tried. Basically, instead of having one weight per neuron we would have two: one being the normal activation weight, the other a "functional" weight, and this particular "weight" would decide which function is performed by that neuron, instead of having all neurons in a layer perform the same computation.
Makes sense to me. A typical neuron speaks to others with chemical signals as well as electrical ones. Not knowledgeable enough to know how it would work, though.
Interesting idea. If you can figure out how to train such a system, it could be used as an optimization (improving latency or energy efficiency). If I remember correctly, it has been shown mathematically that using just a single non-linear function for every neuron is enough to get the same computational abilities (the AI counterpart of a Turing machine). However, the proof is about what's possible, not about what's easy/fast to compute.
Well ... build a proof-of-concept !
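A minimal proof-of-concept sketch of the idea in Python (the activation bank, the hard integer selector, and the shapes are illustrative assumptions):

import numpy as np

# A small bank of candidate activations each neuron can choose between.
ACTIVATIONS = [np.tanh, lambda z: np.maximum(z, 0.0), np.sin]

def layer_forward(x, W, func_idx):
    # W: (n_out, n_in) ordinary weights; func_idx: (n_out,) integer "functional weight"
    # selecting which activation each output neuron applies to its pre-activation.
    z = W @ x
    return np.array([ACTIVATIONS[func_idx[i]](z[i]) for i in range(len(z))])

x = np.random.randn(4)
W = np.random.randn(3, 4)
func_idx = np.array([0, 1, 2])   # neuron 0 uses tanh, neuron 1 ReLU, neuron 2 sin
print(layer_forward(x, W, func_idx))

A hard integer selector isn't differentiable, so training it would presumably need something like a softmax-weighted mixture over the bank (relaxing the choice into a continuous "functional weight") or a discrete/evolutionary search.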
I read about Josephson junctions in the 1980s, proposed as a means of low-noise superconducting switches. Seeing them used to bridge p-bits makes a lot of sense, although I wonder about the scalability of bridging the stochastic behavior with low-temperature junctions.
Neurons use relatively slow, but programmable activation potentials and they work at body temperature.
Just some random thoughts but this was a great topic and I really appreciate it so thank you!
Epic thanks for sharing your knowledge in this exciting future field of computer science! ❤
Also, we can get faster classical c-bit computing with optical computing for precise algorithms. This p-bit tech seems especially suitable for AI. At some point we will have q-bits for similar purposes and for some more advanced stuff.
and with the same problems our carbon based computers in our heads have i.e. they are not as good as their owners think they are.
Basically, you are using the universe, and its randomness, as part of your system. It reminds me of DNA, which doesn't just make things ex nihilo. It plugs into the environment to make things in a cooperative manner.
I love this. DNA was my go-to analogy when describing the difference between code and software to a team of scientists (DNA being the code and the phenotype/animal being its software).
The initial part of your comment "...using the universe and its randomness" reminded me of Stephen Wolfram and his ruliad idea. He would say "using the universe and its computation".
keep selling, salesman.
@@kakistocracyusa your comment went over my head. Who's the salesman and why?
@@trudyandgeorge "using the universe" ? By that glitzy narrative , so is asphalt cooling at night and heating the next day. Thermodynamics was always a cerebral subject.
@@kakistocracyusa I see now, thanks. You know, entropy and the second law is most certainly universal.
❤ 3:13 probabilistically yes 🎉
Thanks for presenting all this information on these new platforms - this has cleared up many things I didn't understand about them.
Excellent video. 👍 I remember proposing a similar idea for a lidar project. 👍
Here comes the next layer of the simulation.
The logical progression for electron computing is photon and graviton computing as in laser splitting graviton entangled photons. Sandaero/Aeronet, Solaser and Uniqua.
Cool!! ❤❤ A semiclassical approach to computing, I guess? But somehow it sounds a lot like D-Wave's quantum annealing to me. But I'm probably just confused. Anyway, great video 👍👍👍👍
subscribed for the investing stuff
Amazing video, many thanks! 🙏🏻
I always learn something new with your videos. ❤💻
Excellent
Thank you!
I commented about this kind of leap forward about a month ago, that there would be some advancement that would GREATLY enhance efficiency of AI compute. And here it is! Thank you Nastya! 😂
We discussed this about a decade ago, except instead of supercooling we were going to exploit the characteristics of tunnel diodes, which could be manufactured on silicon substrates and work at room temperature.
First time it made sense to me. Thanks!
auto ML can spawn super intelligence with this one
As an HDR monitor owner, I enjoyed getting spooked by the transition at 5:05 😁💛
This video is awesome and groundbreaking!
Using Noise to DeNoise an image. Awesome!
It's been a while since I've watched this channel, but every single day there is a breakthrough...
I love your channel so much, thank you for all that you do!
There is still a lot of optimization possible in classical computing. It is easier and more logical to take advantage of sub-nanometer fabrication, cram more into less space, and use classical circuits to do binary operations. The energy use in a classical digital circuit is due to inefficient use of the electron, which just passes between the power and ground rails as a logic operation is performed. If instead the potential were raised to several hundred volts and made to perform logical operations through a chain of capacitors (which are as small as they could be, and do not need to hold charge for more than a few microseconds), each degrading only a few microvolts on its passage, we would have far more efficient computation. We could then reset the entire logic through pulsed femtosecond lasers (the diodes used in q-switched lasers, found in fiber laser markers, can be easily adapted) to prepare for the next logical operation. Currently the capacitive switching (apart from the minuscule crowbar current) is the primary factor that determines the efficiency of computing. The suggested approach should significantly reduce the power consumption due to reduced charge/discharge cycles between the supply rails. I do have an advanced degree in microelectronic design and fabrication and a few decades of industry experience, working at all levels. Just some 💭❤👍
send a letter to intel
The thermodynamic limit does not care. And your "femtosecond lasers" would automatically make the efficiency totally suck.
Beff Jezos is watching
Great, info-dense content. It brings to mind the work on the travelling salesman problem of efficient route planning, which was modeled with a biological analogue in slime mold: simple physical elements modeling, on a small scale, areas with routes geometrically proportional to real-world logistics scenarios.
Thanks! Awesome job, Anastasi!!
This is obviously a cornerstone tech in the ASI
Yeah, so a p-bit is a feedback machine set to automatic to find the "least obstructive" path forward. A simple solution, and common sense, like water flowing downhill, or choosing the best time and point to cross a road. If a "roadblock" problem occurs, the system (computation) backs up until it spatially (computationally) recognises an overflow into a new pathway. Elegant.
Analog computers have two serious disadvantages: there is no error correction for calculations, and their hardware can only solve a single problem quickly. How can error correction be done in analog? Through correlation? How can you build an analog universal computer? Neural networks seem to be able to do it. It would be interesting to apply the inverse Z-transform to digital data models to build analog solutions and see what comes out of it.
There are already FPGAs that can calculate much faster than general-purpose digital solutions. Digital signal processing is also becoming faster with special hardware, e.g. in signal processors.
Overall, a middle way between digital and analog data processing will develop. E.g., the sum of each neuron's inputs, with a Schmitt trigger for the output spike, is analog. The multiplicands in the synapses are digital. Trained data is loaded digitally, as with FPGAs. Calculation is analog.
This almost sounds like "Hitch-hikers Guide to the Galaxy" stuff!
The "Infinite Improbability Drive"!
Any opinion on the new IBM 156 qbit tunable coupler heron? 🤓
Giving computers intuition without breaking.
The power company.
As a computer scientist and software engineer, I see great niche cases for p-bits, or stochastic bits. But 100% of software is 99% deterministic, even when stochastic elements are put inside "deterministic cages". So I want to stress that nondeterministic computing has only niche uses within a deterministic framework, for the great majority of tasks for which humans want to program digital solutions.
A q-bit, actually, can be in many states. The shortest path of the noise to equilibrium determines the resulting probability. How I understand it.
So it's using entropy to compute? If so, how is this programmable? It seems similar to other technologies where the model is embedded within the hardware. It seems like you would need to start in some specific negentropy state as your input. Maybe I'm missing something and someone can enlighten me.
As someone studying active inference in AI, which is fundamentally probabilistic, this is quite exciting. As are the free energy principle and Bayesian inference.
Great presentation. I love your channel!
Thank you so much!
Great and very informative videos, but can you tell me, are you using auto-tune (or pitch shifter) on your vocal?
The probability that you agree with the other person becomes smaller and smaller due to entropy. What if the smallest disorder (high entropy) again forms a large, whole "world view" (B. Mandelbrot)? There is a saying that "history repeats itself". We should consider that as soon as we have "invented/found" something, we should deepen it, to the point that we have found another large structure and researched it again down to the smallest parts. If we look at nature, the last few centuries are a good example of how we were able to use small knowledge to explain the large, and back again with research (Higgs boson / gravitational lensing).
Cheers to everyone.. have a hug, if you like it ;)
I would love to see you explaining also the history of computers, for example the Stibitz half-adder, it was the first step, and it should be given much more credit with simple explanations and tutorials.
Whenever new, better things are big improvements, they are normally a combination of the two things that came before, taking the pros from both and limiting the cons of each.
Thank you once again for a marvelous video! I had no idea this was out there! I want to hear more about it. One question I have is: are probability distributions key in this new technology? Can p-bits operate with different probability distributions? Can they be forced into particular probability distributions? Are such distributions the key to how a probability algorithm works? Another thing I'm wondering is, what would be an example of a probability algorithm, a basic one that might let us see how these machines work? I'm also wondering about interconnections between p-bits. Is there such a thing as probability gates, and how do they work? And you mentioned that information flows in both directions, to and fro between p-bits, but your example was in a single direction. I want to know more about this bi-directional characteristic. How is it achieved in the circuit itself? And finally, I'm wondering about the mathematics used to represent probability circuits.
Yes, PDF are configurable and it’s beautiful!
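To make "configurable" concrete, here is a minimal software sketch of the p-bit update rule commonly used in the probabilistic-computing literature (the coupling values, inverse temperature, and step count are illustrative, not from the video):

import numpy as np

rng = np.random.default_rng(0)

def pbit_sample(J, h, beta=1.0, steps=2000):
    # J: symmetric coupling matrix (the symmetry is what makes the influence bidirectional),
    # h: per-bit bias, beta: inverse "temperature" controlling how sharp the distribution is.
    n = len(h)
    m = rng.choice([-1, 1], size=n)       # random initial state
    for _ in range(steps):
        i = rng.integers(n)               # update one p-bit at a time
        I_i = J[i] @ m + h[i]             # input from neighbours plus bias
        # Each p-bit fluctuates randomly, but its average value follows tanh(beta * I_i).
        m[i] = 1 if np.tanh(beta * I_i) > rng.uniform(-1, 1) else -1
    return m

# Two p-bits coupled to prefer opposite signs (an inverter-like constraint).
J = np.array([[0.0, -1.0], [-1.0, 0.0]])
h = np.zeros(2)
print(pbit_sample(J, h))   # samples mostly land on [+1, -1] or [-1, +1]

Biasing h, or clamping some bits to fixed values, reshapes the distribution the rest of the network samples from; roughly speaking, that is how such networks get "programmed".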
Of course the goat of physics told us the way while beating his drums nearly 50 years ago. The goat is obviously Feynman.
Interesting! If p-bits and probabilistic circuits become realizable in practice it would be the kind of paradigm shift in technology that Ray Kurzweil has talked about.
Super cool, wow ;)
Boltzmann chips with rebel p-bits would fit right into an Alice in Wonderland story, where Queen Entropy, for once, isn’t hostile but actually loves a little chaos! Yes, this is very exciting news! A big congratulations on your course!!
We need to build an improbability drive - Hitch Hikers Guide to the Galaxy 😂
sounds like probabilistic computing could be a near-future tech to enable near real time content generation for gaming, I'm interested
I don't understand; you still cannot run this at room temperature (15:20), it requires cooling down, so what is the advantage of p-bits over q-bits? Also, what are the disadvantages of p-bits versus q-bits? Considering they are not entangled, doesn't that make them inferior to quantum computing?
All I want in life is just 10^40 flops. And we shall get there👍
10⁴² flops would give the optimal answer. In fact, every answer.
I was eating ice cream while watching this and I had to stop because one side of my teeth froze and once the temperature in my head equilibrated I knew that I wanted more ice cream. Conclusion: entropy will make me gain weight :(
I had no idea that research money was being spent on this... seems random, but then again, to come up with something new you sometimes must think outside the box. Interesting video, thank you!
So fast it finishes before it starts, and on a good day even before the Big Bang.
Tim Palmer's book, The Primacy of Doubt, uses these ideas for modeling chaotic systems. If you haven't read the book, I think you will enjoy it. Weather forecasting is inherently chaotic and uses noise similar to your discussion...
I like it, a stochastic computer at last! Many tasks such as complex project scheduling currently use deterministic tools and methods. Nature doesn't work that way.
Interesting technology. About probability: I remember first learning about computers in college. I was always fascinated by random numbers, and I used to think, well, how could a computer generate a random number if everything is so precise? Now that I'm retired, I still enjoy learning new things.
The idea of harnessing noise in computing certainly sparks interest, but I remain doubtful about its real-world application. The promise of 100 million times better energy efficiency is extraordinary, but such a significant leap raises questions about the feasibility of the claims. For one, we're dealing with a technology that embraces inherent unpredictability, a major departure from the precise nature of digital computing, where certainty and reliability are key. As probabilistic computing scales up, the challenges around controlling the noise, ensuring consistent outputs, and maintaining the stability of computations across diverse scenarios remain largely unaddressed. While it may have potential in specific applications, I'm not convinced it's ready to replace, or even coexist with, traditional computing in fields that demand the precision and stability we rely on every day, such as healthcare or finance. Until we see more practical, scalable demonstrations, it's difficult to view this as a universally applicable solution.
So is this computer actually going to work?
Tech engineer pushes back up his glasses with one finger: Probably...
no
But maybe yes. Depending
It's already working, just like how they have quantum computers working. It's just a matter of refining the technology to make it better and competitive with current solutions.
Could this technology be used to find the prime factors of very large numbers? Perhaps the most common claim for quantum computers is that they can break RSA encryption by doing that.
How long before this technology reaches the server corporate market? Home market?
Thank you.
You're brilliant! 👋👋👋👍👍👍👍👍👍
For almost two years now I've been saying that doing AI digitally is completely broken. A neuron is an op-amp, and what we're doing, using SIMD to multiply, accumulate, then apply an activation function, is just a super-expensive emulation of the op-amp. I just don't know how fast we could make op-amps work at the node technology used by CPUs; it might be that digital remains faster, but I strongly doubt it. I'm still waiting for an analog AI chip.
An interesting topic. I wonder when this will become accessible for regular consumers. Computers and gadgets using this technology will need to be practically rebuilt from the ground up and will make everything we've had so far pale in comparison. A major technological (r)evolution!
Question: if you have random bits, don't you still have to figure out an efficient way to read them out for computational use? Are the "computations" random and embedded into the memory somehow? This part was confusing to me.
It's an interesting idea, but do the energy efficiency figures include the energy required to cool the superconductors?
A computer with access to true randomness would be very interesting for generating unique encryption keys.
Who knew Arwen understood computers so well!
This will help with generating sound as pure analog output from a computer, since the digital nature of today's computers heavily restricts how synthesis is output as a signal... exciting concept!
The issue probabilistic computation will run into is local minima when parallel calculations are running. This issue is equivalent to the physical phenomenon of density fluctuations near a critical point. So local solutions (maximum entropy) are strongly influenced by nearby minima in entropy.
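The standard software-side remedy, which the hardware would presumably mirror with a noise/temperature schedule, is annealing: stay hot (noisy) long enough to hop out of local minima, then cool. A toy sketch in Python (the energy landscape and schedule are invented purely for illustration):

import math, random

def anneal(energy, neighbor, x0, T0=5.0, cooling=0.995, steps=5000):
    # Generic simulated annealing: accept uphill moves with probability exp(-dE / T),
    # so the early (hot) phase can escape local minima and the cold phase settles into one.
    x, T = x0, T0
    for _ in range(steps):
        cand = neighbor(x)
        dE = energy(cand) - energy(x)
        if dE <= 0 or random.random() < math.exp(-dE / T):
            x = cand
        T *= cooling
    return x

# Toy 1-D landscape with two wells; the run often settles in the deeper one near x = -2.
energy = lambda x: (x * x - 4) ** 2 + 0.5 * x
neighbor = lambda x: x + random.uniform(-0.3, 0.3)
print(anneal(energy, neighbor, x0=3.0))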
This sounds like the hardware bridge needed for an efficient interface with a quantum computer. Combine the two and you would have some insane computing power to say the least. 😮❤
By embracing chaos, how do they expect to get correct answers? Computers are about input and output. You know the output is correct if you do it by hand. But if you don't know the answer already, how do you know what the chaos computer is outputting is correct?
@ 11:51 minutes ++ ; this looks very similar to a sculptor bringing out any required form or shape from a block of stone / granite, etc. by chipping off the unwanted parts of the stone for the final shape, or like a potter's mud which can be turned into any object by carving out the object required.
FOR THE RECORD: I stated this equation on X a few months ago. On some quantum computing recruitment thread.
I wonder if you can use the randomness of electron tunneling to generate a random value; perhaps you could use an addressable analog voltage-gated memory cell to tune the probability of tunneling. Then it can just use normal CMOS logic gates for computation.
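A minimal software sketch of that idea (the sigmoid mapping from gate voltage to tunneling probability, and all the numbers, are assumptions made only for illustration):

import math, random

def tunable_bit(v_gate, v_half=0.5, slope=10.0):
    # Probability of reading a '1' (a tunneling event) rises smoothly with the gate voltage.
    p_one = 1.0 / (1.0 + math.exp(-slope * (v_gate - v_half)))
    return 1 if random.random() < p_one else 0

# Downstream, ordinary deterministic CMOS-style logic consumes the stochastic bits.
a = tunable_bit(0.7)   # biased toward 1
b = tunable_bit(0.3)   # biased toward 0
print(a, b, a & b, a ^ b)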
This does seem like the practical bridge to full quantum computing. Using a naturalistic framework to figure out natural phenomena.
WONDERFUL! This is EXACTLY what we need! My god. What I could do with that and an ML weighting table! Gosh!
And I love to see you having "your world turned upside down". This is _exactly_ what I have seen over and over every time I teach a coder to be a prompter. "NO! Stop trying to make it act like a computer! SURF THE NON-DETERMINISM! Make it _work_ for you." Fantastic.
"I this house, we obey the laws of thermodynamics!" Homer Simpson.
I'm getting Hitchhikers Guide to the Galaxy vibes here....
So would I be right in thinking probabilistic computing could also be similar to what a "Galton Board" does then?
Do you think that if AI models were run on probabalistic computers it would make another breakthrough and achieve super intelligence?
So, I guess next is the Heart of Gold and the Infinite Improbability Drive.
Thank you, Anastasi. Very informative videos.