Intro to Clustering: 0:27
K-Means: 11:17
SOM: 42:00
Before coming here, I saw about 5 videos on SOM. No one pointed out that the algorithm is the same as K-means. You enlightened me! Thank you very much.
Absolutely brilliant. Clear, concise, flowing, and enlightening. A great help in understanding SOFMs for research I am thinking about doing. 👍
Starts to talk about SOM at 42:00.
Thank you man
Thank you!
My humble regards to the Professor for simplifying complex concepts, explaining the intuition behind the algorithms, and encouraging us to understand 🙏
This is the best explanation I have ever found. Please, is there any way I can see more lectures from this professor on any other channels?
I am really proud of you! You explained SOM like an exciting journey...
Wonderful professor. I can follow him even though I am far from the ML field. I am starting to love Mr. Hamid and also AI methods and techniques. Thanks a lot, my favorite virtual teacher.
So good, loved the enthusiasm and A-class white board usage of the lecturer. Thank you so much for sharing.
Excellent Lecture about clustering. Thank you very much for sharing your knowledge.
I like the way he explains things very clearly. Within machine learning there is a tendency to cloud things to make oneself seem more intelligent; this lecturer shows how simple some of these algorithms (and ML in general) truly are, without dumbing things down.
Amaaaaazing teaching skills!
SOM: 40:39 to 1:14:25
Given input x, find the i-th unit with the closest weight vector by competition: w_i^T x will be maximum.
Find the most similar unit:
i(x) = arg min_k ‖x − w_k‖,  k = 1, 2, ..., m,  where m = number of units.
The "max" refers to the equivalent dot-product form: for normalized weight vectors, the winner is the unit whose weight vector w_k has the highest dot product w_k^T x with the input, i.e. the most aligned pair of vectors. If the vectors are misaligned, the dot product (think cos θ) may be near zero.
Sir, you are a lifesaver. Thank you very much.
Let me take a moment to admire your handwriting :) Plus, "You are becoming your data" has to be a line from an AI movie. Cheers :)
Very neat handwriting for a professor.
SOM starts at 40:42
Lifesaver!
The professor says, "Nobody screams when I make mistakes." I went crazy at my monitor, but nobody heard me :D 28:32
Thanks for sharing the amazing lecture. I wish I could take a class in person someday.
Very good lecture. Thanks for sharing.
Does anyone know of an implementation of SOM in Python 3?
In all the code I have seen, they always use targets, but we don't have any. What should we do?
You could check Peter Wittek's somoclu library.
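Since the thread asks for a Python 3 SOM without targets: SOM is unsupervised, so the targets you see in many tutorials are only used to color or label the map after training, not to train it. Below is a minimal, purely unsupervised sketch (the function name and parameters are my own, for illustration; for serious work, the somoclu library mentioned above is a better choice):

import numpy as np

def train_som(X, grid=(10, 10), epochs=20, lr0=0.5, sigma0=None, seed=0):
    """Minimal self-organizing map; purely unsupervised, no targets needed."""
    rng = np.random.default_rng(seed)
    rows, cols = grid
    d = X.shape[1]
    W = rng.uniform(X.min(0), X.max(0), size=(rows * cols, d))  # one weight vector per unit
    # 2-D grid coordinates of each unit, used for neighborhood distances
    coords = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    sigma0 = sigma0 or max(rows, cols) / 2
    n_steps = epochs * len(X)
    t = 0
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            lr = lr0 * np.exp(-t / n_steps)                 # decaying learning rate
            sigma = sigma0 * np.exp(-t / n_steps)           # shrinking neighborhood
            bmu = np.argmin(np.linalg.norm(W - x, axis=1))  # competition: best-matching unit
            # Gaussian neighborhood around the BMU on the grid
            g = np.exp(-np.sum((coords - coords[bmu]) ** 2, axis=1) / (2 * sigma ** 2))
            W += lr * g[:, None] * (x - W)                  # cooperative update
            t += 1
    return W.reshape(rows, cols, d)

# Toy usage: map 2-D points onto a 10x10 grid; no labels/targets anywhere
X = np.random.default_rng(1).normal(size=(200, 2))
som = train_som(X)
print(som.shape)  # (10, 10, 2)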
Great explanation.
Perfect... thanks!
Why do people use the Mahalanobis distance instead of the Euclidean distance?
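Quick reply on the distinction (my own toy example, not from the lecture): the Mahalanobis distance rescales by the data covariance, so correlated or differently-scaled features don't dominate; when the covariance is the identity, it reduces to the Euclidean distance.

import numpy as np

# Mahalanobis distance: sqrt((x - mu)^T Sigma^{-1} (x - mu));
# with Sigma = I it reduces to the ordinary Euclidean distance
rng = np.random.default_rng(0)
A = np.array([[2.0, 0.0], [1.5, 0.5]])
X = rng.normal(size=(500, 2)) @ A          # correlated toy data
mu, Sigma = X.mean(axis=0), np.cov(X, rowvar=False)

diff = X[0] - mu
mahalanobis = np.sqrt(diff @ np.linalg.inv(Sigma) @ diff)
euclidean = np.linalg.norm(diff)
print(mahalanobis, euclidean)  # differ unless Sigma is the identity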
Who is the other scientist who had new ideas?
"Norbert Wiener and Albert Einstein"
Lecture notes?