I like how this field of research is a blend of chemistry, biology, electronic engineering and computer science
And later, when all this nonsense has stopped, perhaps we'll dig it up some day adding Archaeology to that list.
The future is more interdisciplinary and I like that =P
if you go down enough, biology becomes chemistry and if you go down even more, everything becomes physics.
@@Biped it's all applied math
@@rewrose2838 true. does that mean that being a mathematician qualifies you to be a doctor? :D
This is an interesting research field I did not know the existence about up until now. I propose follow-up videos from Dr Phil Moriarty because this is too interesting!
Lex Fridman recently had a very interesting interview on this topic if you want something longer to listen to
Dr. Phil Moriarty sounds like a fictional character from a Japanese comic book in English...
@@phs125 He is a grand nephew of James Moriarty (if you know what I mean ;)
@@Matthew-kn7xi Thank you, I will definitely watch the video today!
I hope Dr. Phil Moriarty uses more Rush examples in the follow up videos
I'd love to hear more about this, on one condition: we get this man more coffee
And a helmet to protect his skull from the over-excited motor function. Although I guess he'll be first in line for a neuromorphic replacement.
I could listen to him talk all day.
It wasn't coffee, it was weed!!
Yeah, couldn't find a course on YouTube or anything
@@ivanguerra1260 When I drink coffee, people think I'm high on speed
yes please go into more detail about neuromorphic computing
This is one of the most interesting, weird and cool areas of computer science in my opinion
It's so unconventional by current standards, and yet when you think from first principles, I think it's a welcome addition that should help cover a slew of problems that are ill-addressed by current systems.
This is legit the future!
It's really an area of computer engineering
@@paulleimer1218 For now it's experimental, yes, although you can already find applicable problems in e.g. discrete maths. But if/when it becomes a real market, re-thinking computation outside Von Neumann machines is likely to trigger whole new classes of common pipelines, such as compilers, and new notions of Big-O complexity. It's bound to become a distinct sub-domain within CS eventually if neuromorphic computing takes hold. And I haven't even touched on network science, which is currently perhaps the weakest aspect of applied CS, being largely handled with stochastic rather than formally exact approaches (e.g. routing, which still relies on tables at the end of the day, not pure logic).
This is physics, not CS
This is really fascinating, but the parallels to neural circuitry were extremely overstated here. In an actual biological synapse, there's a huge variety of proteins at work doing much more than simply tuning the 'strength' of a connection in a straightforward way. Also: single synapses often employ multiple transmitters; receiving synapses have differing receptors that respond differently to the same transmitter; synapses give themselves feedback about their own activity; different types of neurons have different patterns of baseline firing in addition to action potentials, etc.
In more general terms: Nothing beyond the most primitive neural circuits is comprehensively and computationally 'understood'. Even though Prof. Moriarty talks about the lack of division between memory and 'processing' in humans, this is not exactly correct (there are dedicated 'memory areas' whose destruction or impairment has grave consequences for information storage and the capacity to learn, but they are not by themselves necessary or sufficient for the acquisition or recall of every type of memory). Nobody, in fact, can rigorously characterize where memory 'is' localized in the brain. But again, it is bound to be a combination of macroscopic regions, local synaptic information, temporal firing patterns etc.
An informative video still, and I of course understand why these details and others were glossed over. But taking the neural analogy too seriously is kind of a bad idea in my opinion (at the current level of knowledge).
I was cringing as he described short-term plasticity and long-term potentiation. His blog post is also full of strange statements like "neurons combine (co-locate) memory and processing", which is not actually true. That sort of question goes back to McCulloch and Pitts, who showed that you actually have structures of connected neurons which can do logical processing together, not individually, with the synaptic weighting between these units serving as the memory of the system (Freeman).
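For anyone curious what that McCulloch–Pitts point looks like in practice, here's a tiny sketch (my own toy code, not from the video or the blog post): individual threshold units are trivially simple, but wiring them together gives you logic, and the chosen weights/thresholds are where the circuit's "memory" lives.

```python
# Minimal sketch of McCulloch-Pitts threshold units (illustrative only):
# each unit fires (outputs 1) if its weighted input meets a threshold.
# Logic emerges from how units are wired, not from any single unit,
# and the chosen weights/thresholds are the circuit's "memory".

def mp_unit(inputs, weights, threshold):
    """A McCulloch-Pitts neuron: fire iff the weighted sum reaches the threshold."""
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def AND(a, b):
    return mp_unit([a, b], [1, 1], threshold=2)

def OR(a, b):
    return mp_unit([a, b], [1, 1], threshold=1)

def NOT(a):
    return mp_unit([a], [-1], threshold=0)

def XOR(a, b):
    # XOR needs a small network of units -- no single threshold unit can do it.
    return AND(OR(a, b), NOT(AND(a, b)))

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "->", XOR(a, b))
```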
I think the implication is not that these devices exactly mimic the behavior of the brain, but that they can be "trained" as typical neural networks can, in order to more closely mimic it. I would imagine that during training there would indeed be separation and grouping of these small neuron-alikes to form larger structures. I do however have some major issues with CMOS being called energy hungry. It's only energy hungry because it can be, not because it has to be. Right now computing is based on flat chips with a limited surface area and a fixed cost, and one thing you can do to increase the computing power is to drive the circuits faster. Frequency is a large part of the power consumption of chips, and part of the reason that phones are so power efficient is that they have multiple cores which run at 1 GHz or less. GPUs are actually fantastically power efficient for their raw amount of performance because they have huge numbers of cores working in parallel at lower frequencies. Now imagine if you could fabricate an entirely integrated CPU with modern transistor density, not only in 2D but in the full 3D volume equivalent of a human brain, and then have the code to utilize that many parallel processors. You could drive them at 1 MHz or less and still crush most other devices in highly parallel tasks. However, right now that technology would be horrendously expensive and maybe even impossible at our current stage of technological progress. It will probably require nanoscale 3D printing to achieve anything close to that, which is fairly similar to how brains are formed (in a broad sense).
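To put rough numbers on the frequency-vs-parallelism point, here's a back-of-envelope sketch using the standard dynamic-power relation P ≈ C·V²·f (and the fact that lower frequency usually also allows a lower supply voltage). All the capacitance, voltage and frequency values are made up purely for illustration.

```python
# Back-of-envelope sketch of the frequency vs. parallelism trade-off.
# Dynamic CMOS power is roughly P = C * V^2 * f; lowering frequency also
# usually permits a lower supply voltage, so the savings compound.
# All numbers below are invented purely for illustration.

def dynamic_power(capacitance_f, voltage_v, freq_hz):
    return capacitance_f * voltage_v ** 2 * freq_hz

C = 1e-9  # effective switched capacitance per core (made-up value)

# One fast core: 4 GHz at 1.2 V
one_fast = dynamic_power(C, 1.2, 4e9)

# Eight slow cores: 0.5 GHz at 0.8 V each -- same total 4e9 cycles/s
eight_slow = 8 * dynamic_power(C, 0.8, 0.5e9)

print(f"one fast core : {one_fast:.2f} W")
print(f"eight slow    : {eight_slow:.2f} W")
print(f"ratio         : {one_fast / eight_slow:.1f}x more power for the same throughput")
```

(This assumes the workload parallelises perfectly, which of course real workloads rarely do.)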
"Memory" can mean several things, for example the memory of a event in the brain, which could be something like "data memory".
Or it can mean: you remember how to drum, walk, talk, etc. This is more like "program memory".
I think that "neurons combine (co-locate) memory and processing" refers to this "program memory"... the algoritms used to process data.
Indeed, it's more like processing and memory are the same thing in neural nets.
On the other hand, neural nets also store "data memory" as they are pattern recognition devices.
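A concrete toy example of "data memory" and processing literally sharing the same weights is a Hopfield network: the stored patterns live in the weight matrix, and recalling them is just repeatedly applying that same matrix. A minimal sketch (my own illustration, not something from the video):

```python
# Toy Hopfield network: the weight matrix IS both the stored "data memory"
# (the patterns) and the "program" that recalls them -- a concrete example
# of memory and processing sharing the same substrate.
import numpy as np

def train(patterns):
    """Hebbian storage: weights are a sum of outer products of +/-1 patterns."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)
    np.fill_diagonal(W, 0)
    return W / n

def recall(W, state, steps=10):
    """Recalling a pattern is just repeatedly applying the same weights."""
    for _ in range(steps):
        state = np.sign(W @ state)
        state[state == 0] = 1
    return state

if __name__ == "__main__":
    stored = np.array([[1, -1, 1, -1, 1, -1],
                       [1, 1, 1, -1, -1, -1]])
    W = train(stored)
    noisy = np.array([1, -1, 1, -1, 1, 1])   # corrupted copy of pattern 0
    print(recall(W, noisy))                  # converges back to pattern 0
```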
It's all about Spheroidal Cows...
It’s almost as if he’s doing a 13 minute video and can’t completely explain a whole scientific field! What’s more he specifically says in the video that he will be leaving stuff out. Then absolute geniuses turn up in the chat to try to prove how smart they are by listing all the things he didn’t explain. Absolutely cringe.
A highly caffeinated Professor Moriarty is a joy to behold. He was going 100 miles a minute, stopped on a dime to answer you and then took off again without missing a beat.
If you have the opportunity, then it would be awesome with more videos on this topic in the future.
I love this guy's energy and ability to explain such a vast and deep field to a common layman with no prior knowledge of these things, and without math at that. Amazing, please keep these videos going and give Phil more coffee 😂
Ray Kurzweil predicted that we will achieve AGI by 2029; I think he is right.
Yes! Please do more videos about it :)
Vote +1 very interested
I love listening to people talk about something they're passionate about... And I can see that he's passionate... I love this... I love learning from people like this...
Drumming might not have been the best analogy. Drummers are usually working with fewer synapses :)
Is this a good musical joke or is this a sick burn ?
Haake, T., Garstka, M. et al. would like a word with you
The drummer might have locked his keys in the car, but it's the bass player that's stuck inside. 🔥
Dr Moriarty is usually pretty passionate and energetic, but he's totally wired in this one! Caffeine is a hell of a drug. Love his passion for the field and his eloquence in breaking down complex topics.
Now that NPUs are becoming mainstream and available in laptops/smartphones/GPUs, this video deserves an update because there's like ZERO videos other than this that describe how NPUs actually work. Prior to this, all I knew is that somehow they leveraged Kirchhoff's law, but seeing that memristor diagram put it into perspective.
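For anyone else wondering about the Kirchhoff's-law part, here's the basic idea as I understand it (an idealised sketch, ignoring wire resistance, sneak paths and device variability): each memristor's conductance acts as a matrix weight, Ohm's law (I = G·V) does the multiplications, and Kirchhoff's current law does the additions by summing the currents flowing into each column.

```python
# Idealised sketch of how a memristor crossbar does a matrix-vector multiply:
# conductances G[i][j] are the stored weights, input voltages V[i] are applied
# to the rows, Ohm's law gives per-device currents I = G*V, and Kirchhoff's
# current law sums those currents down each column. Real arrays also have to
# deal with wire resistance, sneak paths and device variability, all ignored here.
import numpy as np

G = np.array([[1.0e-3, 0.2e-3, 0.5e-3],    # conductances in siemens (the weights)
              [0.1e-3, 0.9e-3, 0.3e-3]])

V = np.array([0.2, 0.5])                    # row voltages in volts (the inputs)

# Each column current is the sum over rows of G[i][j] * V[i]  (Ohm's law + KCL)
I = G.T @ V                                 # column currents in amperes

print(I)            # the analog "output vector"
print(V @ G)        # same numbers computed as a plain matrix-vector product
```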
He is so enthusiastic about his research! I ❤️ his energy.
I'd love more videos about this field! Super interesting and love the enthusiasm this guy has!
I could listen to this guy all day has such enthusiasm
GIVE US MORE OF THIS! Seriously this is a fascinating area of research. I'd love to see many more videos about this topic.
Ok this was AMAZINGLY interesting and well explained. Please we need more on this field !
What about FPGAs?
This man is so excited, it's like he's on the verge of a breakthrough. Love his enthusiasm!
I always trust a man who's this excited and who likes John Bonham.
I had to watch the first two minutes twice, because I was geeking out on his shirt.
I've been working on simulating memristors as artificial synapses and I can really feel this man's excitement
People like that, and an approach and hyper-passion like that, are what give me chills. This stuff is fascinating. Thank you so much.
That Zeppelin shirt is fantastic! More videos with Phil!!! Also, nice to see info about the electrical hysteresis loop. Should make a computerphile vid on the history of the Law of Hysteresis discovered by Charles Proteus Steinmetz. It's EXTREMELY computing-relevant.
This is the Dr. Phil I would rather watch.
The Dr Phil actually worth watching.
Memristors are useful for standard computing too, so there are multiple funding reasons for their development.
Wow! I never came across such a fascinating research field before, my mind is totally blown away. The way Dr Phil Moriarty explains it shows his passion and enthusiasm for this field, thank you Dr Phil. What I've observed is that academia generally doesn't support such path-breaking research, so we need more private industrial research so that at least we can try and experiment with such unique ideas without having to worry about a lack of funds. If possible please make more videos on this topic, thanks again for making such an awesome, informative video :)
As a materials science master's student, I'm writing my thesis on neuromorphic computing this year. I'm insanely excited, to say the least.
Yes.. Yes please do more of these
The fact that we can (potentially) mimic the function of a biological, wet, neurone using solid state matter is absolutely mind blowing
Moriarty preaching about Neuromorphic computing gives me great hope, as I'm a physics student but I love concepts like these and am glad to know studying physics won't necessarily keep me away!
I would definitely watch a series on this topic
Thanks for decomposing the memristor to its bare essence.
I always love Dr. Moriarty's energy and delivery.
Very well presented and, as always, Dr Moriarty's infectious enthusiasm made me want to buy one (and I don't even know what it is)
As a second year physicist, this really helps me decide where to go and what to do...computer brains it is!
I'd love it if this could be a series. Also love the way he introduces papers that might be interesting to check out.
Brady/Sean we need more videos on neuromorphic computing!!
Beautiful explanation. Imagine the fulfillment that engulfs researchers in this field; it must be intoxicating.😀
Dude, I have been waiting SOO long for someone to talk about neuromorphic computing on YouTube.
please sequel! all those juicy materials too, Transition metal dichalcogenides (TMDC), OxRAM, CBRAM, Ferroelectric systems, floating gate transistors. then one more ep on short term plasticity, long term memory consolidation, Prof JJ Yang's short term plasticity temporal coding paper. mind blown.
Interesting field of research that I didn't even know existed. Would very much love to see more content on this!
One additional aspect of the memristor's ability to remember its past states, via the vacancies in the synapses of the board, is the Toffoli gate itself.
Each input neuron and each output neuron on the board has its own ancilla bits inside its Toffoli gate, which means it's an actual qubit that can help remember the last state of the synapse.
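I'm not sure memristor crossbars actually use Toffoli gates, but for reference, a Toffoli (CCNOT) gate is a reversible gate that exists in classical reversible logic as well as in quantum computing; with the target line prepared as an ancilla set to 0 it computes AND of the two controls without destroying them. A minimal classical sketch:

```python
# For reference: a Toffoli (CCNOT) gate is a reversible logic gate. It exists
# in classical reversible computing as well as quantum computing. With the
# target bit prepared as an ancilla initialised to 0, it computes AND of the
# two control bits without destroying them.

def toffoli(a, b, target):
    """Flip the target bit iff both control bits are 1 (reversible)."""
    return a, b, target ^ (a & b)

if __name__ == "__main__":
    for a in (0, 1):
        for b in (0, 1):
            print((a, b, 0), "->", toffoli(a, b, 0))   # ancilla=0: target becomes a AND b
    # Applying the gate twice restores the original state -- that's reversibility.
    print(toffoli(*toffoli(1, 1, 0)))                  # (1, 1, 0)
```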
It’s fascinating that the more we mimic nature with our technology the more advanced our technology gets, it’s almost as though nature itself is its own form of highly sophisticated technology.
I was absolutely engrossed in this video because of the content... I need more... PLEASE??
The day we fully understand our brain will be the day we'll have the biggest jump in how to correctly design better computational architecture
I can feel Dr Phil's energy. It's amazing.
Please do more videos on this topic!!!
Wow! We need more of Dr. Moriarty!
I love the enthusiasm of Dr Phil Moriarty! Inspiring! Keep on producing such awesome work!
This will have a big impact on everything.
Since 2017, when I was a senior in high school, I have been trying to find something related to the nervous system that could make a huge improvement in how the brain manages information and conducts signals. The answer is neuromorphic computing + memristors. I have been researching neuromorphic computing since that year and the potential it has is amazing.
Of course, a lot of things have to be added with a lot of techniques. Non-invasive brain-machine interfaces are going to be the future of humanity in my opinion. These "artificial synapses" could be made by simulating the central nervous system (CNS) and somehow making the electrodes the memristors. And the question is how. One of my suggestions is using ultrasound waves controlled by EMG signals. Ultrasound waves can be steered, and of course the signal would not be perfect because the EMG device would be outside the skull, but there are electronic devices that can correct for this with the help of AI.
It's not easy because each neuron has a different behavior, role and location. So it would be a huge thing to work out in a neural network.
Currently I'm doing my major in Biomedical Engineering, and I know I'm a newbie at these kinds of things, but I have an extreme interest in neuromorphic computing.
Please, more content on this topic!
I just started graduate studies in computational neuroscience/machine learning! This is super cool and I want more videos!
This sounds like one of the most important topics covered here, this is amazing! Would love to see more of this.
Isn't this what renowned computer scientist Miles Dyson was working on way back in the day?
Indeed, memristor technology is actually the product of a bootstrap paradox in which the technology was discovered from the remnants of a murderous robot from the future.
@@mushin111 Someone should make a movie about that!
Hi guys definitely would like to hear more about this topic.
Thanks and keep up the excellent work
Super interesting stuff; would definitely be interested in more of this field.
Please Computerphile, do more of these video, this is a very exciting topic :)
I have applied to work under a professor on this topic.... Hopefully I'll get into this amazing world. This is an amazing field... Thanks for uploading this video
where?
Goodluck!🥰 This is a fascinating topic to study!
I absolutely want more of this
Very much like how neurotransmitters change the probability of a neuron's action potential propagating... wow... this is amazing...
My favourite lecturer on youtube by far.
Maybe it's because I'm a guitarist and play some drums.
Luv and Peace.
First time I've seen this guy..
What a legend, the way he explains things in layman's terms is awesome.. Someone buy this man a coffee... I want to learn more from him..
Please do more on this field. I would definitely watch that
Ok so this is equally fascinating and terrifying. Please give us more.
I’m currently doing my PhD in neuromorphic computing; it definitely is the way forwards.
Regular activation patterns lower the threshold, so your brain can compute more easily afterwards.
Constant firing slows neurons down to avoid over-stimulation, like when you stare at a bright light and it seems to get darker.
It's also down to chemical resources in the brain, and to the fact that being "smart" means being energy- (i.e. chemically) efficient.
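A toy spiking-neuron model with an adaptive threshold shows roughly the effect described here, where constant input leads to progressively slower firing. All the constants below are invented and it's only meant as an illustration, not real neuroscience.

```python
# Toy leaky integrate-and-fire neuron with an adaptive threshold:
# under a constant input the threshold creeps up after each spike,
# so the firing rate drops over time (spike-frequency adaptation).
# All constants are invented for illustration.

def simulate(steps=200, input_current=1.2):
    v, threshold = 0.0, 1.0
    spikes = []
    for t in range(steps):
        v = 0.9 * v + 0.1 * input_current            # leaky integration
        threshold = 1.0 + 0.95 * (threshold - 1.0)   # threshold relaxes back towards 1.0
        if v >= threshold:
            spikes.append(t)
            v = 0.0                                  # reset after a spike
            threshold += 0.5                         # adaptation: harder to fire again
    return spikes

if __name__ == "__main__":
    spikes = simulate()
    gaps = [b - a for a, b in zip(spikes, spikes[1:])]
    print("inter-spike intervals:", gaps)            # intervals lengthen as the neuron adapts
```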
10:20 I'm sure it's unintentional, but I love how the box has the "Intel Inside ™" logo on the side while it's mimicking a memristor ahahah
Neurons that fire together, wire together.
I definitely want to hear more about neuromorphic computing.
9:58 But this one goes to 11!
Definitely want more clips on this field!
The capabilities of our brains are things that were trained over many days when we were toddlers.
That is absolutely fascinating. I really would like to know more about these neuromorphic architectures and how they work.
This makes so much sense. Amazing. Please make more!!
"You don't have a lot of liquid in a laptop"
Coffee has entered the chat.
Coffee is not exactly a liquid, but a mixture of various solutions and suspensions of solids.
Incredible video! Thank you very much!
I was working on TiOx memristors at the University of Southampton and their potential is unbelievable!
Shame they need a massive investment in order for them to be fabricated!
Thank you guys! This brought back incredible memories!
This assumption of co-locating storage and computation was actually omnipresent in my PhD, but at a larger scale: in a distributed system, each computer has both storage and computation abilities.
Nice video as usual from all these Nottingham channels :) so YES PLEASE, MOAR on this! I'm in Switzerland next to EPFL and I've heard about the Blue Brain Project, but I didn't think others were so close to developing a memristor.
Yikes, thank you so much for bringing this more into the spotlight! I did research on memristors during grad school; neuromorphic computing, the physical side of deep learning, really is the next big thing.
Finally Neuromorphic computing is getting more recognition!!
So our consciousness is computational? Confirmed? Also, yes, more videos please. You're one of the best channels on YouTube.
Great work!! Please keep uploading content about this subject
I haven't read the paper, and I'm probably not going to (it's way out of my field tbh), but I feel like an easier way to do this without requiring the physical movement of ions would be to use something that changes the electron configuration of the d and f orbitals, to change the conductivity or magnetic properties of the ions in the current path. So you could hypothetically have, say, a manganese atom, and by sending it a little bit of energy (below the first ionization energy of the Mn(II) ion) you could read the resistance and see which state it's in. But if you increase the current you'll slowly start stripping electrons from it. So it starts at Mn(II) and each memory state is an electron configuration up to Mn(VI), giving five distinct memory states: Mn(II), Mn(III), Mn(IV), Mn(V) and Mn(VI). But since computers use binary, you could use Mn(II) and whichever other ion best fits the currents and energies being used. So again, when you go to read it you get different resistances based on the ion's state. And the best part is that the state is easily re-writable: by increasing the energy to write the ion to its Mn(VII) state, you can then take advantage of the comproportionation reaction of Mn(II) and Mn(VII), plus a few electrons, to reset all of the Mn ions into the 2+ state. I'm a biochemist who's done some inorganic chemistry, and I'm not sure this will work, I have no idea about the chemistry of computer chips. But I'm basing it on the water-splitting complex from photosynthesis, which kind of does this.
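No idea about the chemistry either, but just to make the read/write logic of that scheme concrete, here's a toy model where each hypothetical oxidation state maps to a distinct resistance, a write pulse steps the state up, and a reset pulse returns the cell to the base state. The states, resistances and pulse semantics are placeholders, not a claim about any real device.

```python
# Toy model of the multi-level idea above: each (hypothetical) oxidation state
# maps to a distinct resistance, a write pulse steps the state up, and a reset
# pulse returns the cell to the base state. All values are placeholders -- this
# only illustrates the read/write logic, not any real device chemistry.

STATES = ["Mn(II)", "Mn(III)", "Mn(IV)", "Mn(V)", "Mn(VI)"]
RESISTANCE_OHMS = {s: 1_000 * (i + 1) for i, s in enumerate(STATES)}  # made-up values

class MultiLevelCell:
    def __init__(self):
        self.level = 0                      # start in the base state Mn(II)

    def read(self):
        """Low-energy read: report the resistance without changing the state."""
        state = STATES[self.level]
        return state, RESISTANCE_OHMS[state]

    def write(self):
        """Higher-energy pulse: strip one more electron, if possible."""
        self.level = min(self.level + 1, len(STATES) - 1)

    def reset(self):
        """The comproportionation step in the comment: back to the base state."""
        self.level = 0

if __name__ == "__main__":
    cell = MultiLevelCell()
    cell.write(); cell.write()
    print(cell.read())     # ('Mn(IV)', 3000)
    cell.reset()
    print(cell.read())     # ('Mn(II)', 1000)
```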
Love this!!! Just my vote, but please do more on Neuromorphic Computing. No CS or CE background, just an interested fan of the field, and I’ve long been interested in this subject. Thanks!
It sounds like the system designs a hardware algorithm for every "experience." And our lifelong collection of these algorithms is what's recognized as intelligence.
Initially I thought it was just a triple-shot espresso that you had bought him, but then, when I began to understand the concept that was being explained to me, I saw the real source of his great excitement. 😵
The most valuable video I watched in 2021, really! I felt like I wasn't wasting my time.
I've been a member of IEEE since '98, when I got my BS in EET, and nanotechnology has been in the works for a while now.
Phil reminds me so much of one of the monologues in the movie 'Waking Life'.
Yes! More videos on neuromorphic computing please!
Yes: please do more videos on this topic. Thank you.
Wait, is his name really Dr. Moriarty??!!! That’s incredible!
Yes. More please.
This is fascinating, please more.
I would love to see more videos on this.
I would definitely not mind more videos in this field!
More videos on this topic, PLEASE!