I am surprised by the low view count. I found your channel while trying to understand BERT, and the way you explain is amazing. Someone who is just starting out can also understand your videos! Great content!!
I appreciate you breaking down the inner working of the system for us simpletons. It makes things so much more interesting.
Watching at 1.25X works perfect. Awesome Content, especially ones on Bert and Transformers.
Yes, I enjoyed this tremendously!
Thanks for explaining this!
You explained it so clearly... thank you... I will try to digest it now...
Your content is Amazing.
your knowledge is outstanding
Really nice video man! Thanks a lot man!
You are very welcome! Thanks for watching !
Man .. you are awesome. I really love the way you start from the absolute basics and build it up from there.
Can you also please make an additional video showing how we can use a GPU when fitting models like BERTopic to the data?
Resourceful thanks
thanks , would love to see video on comparison between k80 vs v100, p100, t4 :)
it did help me so much , Great job
I am super glad this helps :)
Very nice
Thanks for watching :)
By neural network, do you mean the human brain, which is a neural network? Or an artificial one?
Not much difference, except in implementation. Both have inputs (dendrites), outputs (axons), and connection weights on those inputs and outputs. The difference is that in the artificial neuron the connection weight is a number, while in the biological one it is how strongly the neural cells are connected together. So you see... the neuron alone is actually fairly useless, since the main ingredient of intelligence is how the neurons are connected. It is the connections between them that make the magic happen.
For example, some people think that if all the connections between the neurons in your brain were stronger, you'd have a great memory or become very intelligent! No, that's not how it works. If all of your neurons' connections were very strong, you'd lose all sense of consciousness, memory, and awareness. It's like the 0s and 1s on your hard drive: set everything to 1 and you've effectively erased all the valuable data.
So anyway, if the neuron itself doesn't hold any data and it's the connections that make the magic, then what even is the point of a neuron cell? Glad you asked!
In computer science, every neuron is represented as an activation function that sums up all of its input connections and outputs to multiple other neurons. Every connection has a weight and a bias, but let's ignore the bias for the sake of simplicity.
So suppose a weak connection has a weight of 0.1 and a strong connection has a weight of 9.4. Remember, these are just connections, not neurons themselves. Now, when a neuron is activated, it sends a value through all of its outgoing connections; let's call that value the "signal" and say it is 42.
The signal gets transmitted through the two connections to two different neurons:
N0_1
--> N1_0: 42 * 0.1 = 4.2 strength
--> N2_0: 42 * 9.4 = 394.8 strength
Clearly neuron N2_0 responds to signals from N0_1 much more strongly, so it gets activated more.
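The toy numbers above can be checked in a few lines of Python (the names N0_1, N1_0, N2_0 are just the labels from this example, not any library convention):

```python
# Toy check of the example above: one source neuron N0_1 sends a
# signal of 42 through two weighted connections to N1_0 and N2_0.
signal = 42.0
weights = {"N1_0": 0.1, "N2_0": 9.4}  # weak vs. strong connection

# Each target neuron receives the signal scaled by its connection weight.
received = {name: signal * w for name, w in weights.items()}
for name, value in received.items():
    print(f"{name}: {value:.1f}")
```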
But that's not all. A neuron's function, as we said, is to sum over all its incoming connections; in this example each neuron had just one incoming connection, so there isn't much to sum. I didn't write out a bigger example because a fully connected network needs on the order of n² connections, and that gets complicated really fast in a YouTube comment.
When a neuron is activated, it not only sums its inputs but also puts the result through an activation function, which maps the result into a useful range of signals, or simply discards negative numbers.
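As a rough sketch of that in plain Python, using ReLU as one example of an activation that discards negative numbers (an assumption here; many other activations exist):

```python
# Minimal sketch of one artificial neuron: sum the weighted inputs,
# then apply an activation function. ReLU is a common choice that
# simply clips negative results to zero.
def relu(x):
    return max(x, 0.0)

def neuron(inputs, weights, bias=0.0):
    # Weighted sum over all incoming connections, plus the bias we ignored above.
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return relu(total)

print(neuron([42.0], [0.1]))   # weak connection: small positive output
print(neuron([42.0], [-9.4]))  # negative sum gets clipped to 0 by ReLU
```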
How does a neuron learn? It compares the final result with the expected result or expected behavior (in a biological system, if the result was bad, you feel pain, either physically, because it hurt your survival, or emotionally), so neurons adjust themselves toward the values that give positive results: strengthening the connections that gave good results and weakening the connections that gave bad results, creating intelligent behavior at the end of the process.
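That adjust-toward-the-expected-result idea can be sketched as a simple error-driven weight update. This is a perceptron-style rule chosen for illustration, not the exact mechanism of any real brain or of modern backpropagation:

```python
# Sketch of error-driven learning: connections that pushed the output
# in the right direction get strengthened; the rest get weakened.
def update_weights(weights, inputs, expected, actual, lr=0.01):
    error = expected - actual  # positive if the output was too low
    return [w + lr * error * x for w, x in zip(weights, inputs)]

weights = [0.1, 9.4]
inputs = [1.0, 1.0]
new_weights = update_weights(weights, inputs, expected=10.0, actual=9.5)
print(new_weights)  # both weights nudged up slightly, since the output was too low
```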
In a biological system, that's done for you automatically, because stronger connections get more cells, while the cells of weak connections die off after some time. So as you can see, biological neural systems aren't very efficient: they rely on the cells of weak connections eventually dying out, which is a long process, so humans and every other animal don't learn as fast as an AI does.
It is no secret that the singularity will happen one day, where an AI gets far more intelligent than the entirety of the human race, thanks to its ability to learn really fast and to the exponential growth of technology. That doesn't mean it'll become hostile, though.
Does the NumPy lib use the GPU?
nope
Neither does scikit-learn, nor will it ever.
Google CuPy, that's NumPy with CUDA.
So, I can multiply any matrix size; of course, time will be an issue.
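For reference, a plain NumPy multiply runs on the CPU; CuPy exposes a near-identical API on the GPU, so swapping the import is the usual pattern (details depend on your install):

```python
import numpy as np  # swap for `import cupy as np` to run on a CUDA GPU

# CPU matrix multiply with NumPy (backed by BLAS). Any size works, but
# for n x n matrices the runtime grows roughly like n^3, which is why
# large multiplies are where a GPU starts to pay off.
n = 200
a = np.random.rand(n, n)
b = np.random.rand(n, n)
c = a @ b
print(c.shape)
```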
You've got a lot of informative videos with great visualizations... but in all of them you keep pronouncing "matrix" and "matrices" wrong! Except here... you accidentally said both correctly at 2:45! It was wild to hear them said correctly after wincing so many times.
He said every single "matrix" and "matrices" correctly in this video. Are you sure you weren't hearing "metrics", which is a totally different concept in machine learning?