Principal Component Analysis (PCA) | Part 2 | Problem Formulation and Step by Step Solution
- Published 7 Jul 2024
- This video breaks down the problem formulation and offers a step-by-step solution guide. Enhance your understanding of PCA and master the techniques for dimensionality reduction in your data.
Code used: github.com/campusx-official/1...
About Eigen Vectors:
www.visiondummy.com/2014/04/g....
• Eigenvectors and eigen...
Plotting tool used:
www.geogebra.org/m/YCZa8TAH
============================
Do you want to learn from me?
Check my affordable mentorship program at: learnwith.campusx.in/s/store
============================
📱 Grow with us:
CampusX' LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
⌚Time Stamps⌚
00:00 - Practical Example on MNIST Dataset
00:33 - Problem Formulation
12:55 - Covariance and Covariance Matrix
23:17 - Eigenvectors and Eigenvalues
25:37 - Visualizing Linear Transformations
35:35 - Eigendecomposition of a Covariance Matrix
38:04 - How to solve PCA
43:41 - How to transform points?
48:18 - Code Demo with Visualization
56:00 - Outro
The Guru is Brahma, the Guru is Vishnu, the Guru is the god Maheshwara.
The Guru is verily the supreme Brahman; salutations to that revered Guru.
You are the real teacher, sir.. 💫
This is hands down the best PCA explanation I have seen on the internet. Period!
True
indeed
I have studied eigenvalues and eigenvectors multiple times, but this video explained the depth of it to me in a very simple way! One of the best teachers out there.
You, sir, are the perfect example of feature extraction.
You convert high-dimensional knowledge into 2D and make it understandable..
Thank you so much, sir.
I am always amazed by how important the concepts of eigenvectors and eigenvalues are; they are among the most important concepts of quantum mechanics. Every operator (e.g. energy, momentum) in quantum mechanics is a linear operator, and our aim is usually to find the corresponding eigenvectors and eigenvalues. The time-independent Schrödinger equation usually takes the form of an eigenvalue equation, Hψ = Eψ. It's so amazing to see how these concepts are finding their role in Machine Learning as well. My love for Math keeps on growing. As always, thank you for your amazing videos.
Omg, in one video he explains the most difficult linear algebra topic, applies it to machine learning, and also shows us the code. Hats off
It's a video in 3 parts, and the way he is explaining it, even a layman can understand the concept.
No one has ever explained eigenvectors in such a simple way. You are awesome !!
You are simply amazing. I bet no one can teach PCA so well.
Honestly, everything I know, I owe to you. Thank you for being the real HERO in the time of need!!!
This channel is so underrated. By watching this PCA video, everyone can understand dimensionality reduction. Thank you, sir, for the hard work.
The Guru is Brahma, the Guru is Vishnu, the Guru is the god Maheshwara.
The Guru is verily the supreme Brahman; salutations to that revered Guru.
Salutations to you a million times over, sir, for this knowledge you have given to all of us through YouTube.
The most epic PCA explanation ever seen on YouTube, and never yet done by any DS YouTuber. Hats off to your teaching skills, sir.
I am blown away by understanding the true meaning of eigenvectors. I always knew the definition, but now I have understood the meaning. You are a savior!
Actually, I was learning PCA for the first time. When I watched the video the first time I didn't understand it, but when I watched it a second time all the topics became very clear. This video is amazing.
Sir, no one on YouTube has taught this so intuitively. Even paid courses can't teach in this much depth.
Thank you so much, Sir. You not only cleared my doubts about how PCA works, but also, for the first time, gave me the mathematical intuition behind eigenvalues, eigenvectors, and even matrix transformations, which I have been learning for so many years.
Best explanation I've seen regarding this topic.
The most satisfying Machine Learning lecture that I've ever seen by far🤩🤩
45:44 I think it's going to be (3, 1), and when transposed it's going to be (1, 3), which is then multiplied with the matrix representing the dataset: (1, 3) × (3, 1000). This representation is valid too.
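A minimal NumPy sketch of the shapes in that comment; the dataset layout and sizes here are assumptions for illustration:

```python
import numpy as np

# Hypothetical setup: 1000 samples of 3 features, stored one sample per COLUMN
rng = np.random.default_rng(0)
D = rng.normal(size=(3, 1000))     # dataset matrix, shape (3, 1000)
u = rng.normal(size=(3, 1))        # a direction vector, shape (3, 1)
u = u / np.linalg.norm(u)          # normalize to a unit vector

projections = u.T @ D              # (1, 3) @ (3, 1000) -> (1, 1000)
print(projections.shape)           # one scalar projection per sample
```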
My head hurts; this is so descriptive and apt, and worthy of all the time. Best. Kudos.
You are really a good teacher. I am at IIT Bombay, Environmental Engineering, M.Tech, but I wanted to learn ML, and this playlist is so far the most understandable for me.
Brother, better than this, you should have done CS at a private college.
Lmao @shubhamagrahari9745
Best teaching skills I have ever seen, for all Machine Learning concepts. Hats off to you, Sir!🎉🎉🎊🎊
You have cleared my concept which was not well explained by any other instructor on youtube! Great job❤❤
One of the best videos I found for PCA. You have great skills brother.
Love from KARACHI, PAKISTAN.
This is called Teaching ! Thanks for this wonderful explanation.
Had a blast, brother; hardly anyone on YouTube has given such a good explanation.
One small explanation of the shortcut in the lecture at 16:04:
the actual covariance formula includes xmean and ymean; here both were zero, which is why the shortcut sum(x*y)/3 works.
The formula for covariance is:
covariance(x, y) = summation[(x - xmean)(y - ymean)] / n
For the same reason, the covariance matrix has variances on its diagonal (22:57):
both features are the same x,
so covariance(x, x) = summation[(x - xmean)(x - xmean)] / n,
which is exactly the formula for variance (see the sketch below).
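A tiny NumPy check of that shortcut; the toy values below are made up for illustration:

```python
import numpy as np

# Mean-centered toy data, as in the lecture, so xmean = ymean = 0
x = np.array([-1.0, 0.0, 1.0])
y = np.array([-2.0, 0.0, 2.0])
n = len(x)

cov_full = np.sum((x - x.mean()) * (y - y.mean())) / n
cov_shortcut = np.sum(x * y) / n          # valid only because the means are zero
var_x = np.sum(x * x) / n                 # covariance(x, x) reduces to variance
print(cov_full, cov_shortcut, var_x)      # 1.333..., 1.333..., 0.666...
```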
How can I thank you? What a wonderful teacher, available for free for the help of so many students.
Thank you so much, sir. You always leave us awestruck by your remarkable explanation and in-depth knowledge. I never knew this topic could be explained with this much clarity.
The teacher I never knew I needed in my life ❤️✨
This Channel is Goldmine 🙌
You have made a complex topic like PCA so easy for us to understand.
Best explanation of PCA I've ever seen!! ❤️
What an amazing explanation, very intuitive. Am following your whole series, sir.
Amazingly explained. 🤩👏🏻
This is the best video I have watched on this topic!
Wow best video on PCA on internet
Speechless .....you deserve a million subscribers at least
Natural Language Processing (NLP): ua-cam.com/play/PLKnIA16_RmvZo7fp5kkIth6nRTeQQsjfX.html
You are the best, Nitish. Thanks for all these.
Your content is the best ever!! Thank you, sir!
I have seen and understood the linear algebra playlist at 3Blue1Brown, but you clarified my doubts even more. Thanks.
For those wondering why there are three eigenvectors every time: the covariance matrix is a symmetric matrix, and real symmetric matrices have n linearly independent, orthogonal eigenvectors. The zero vector is not considered an eigenvector although it satisfies Ax = λx; likewise there can be up to n linearly independent eigenvectors for an n×n symmetric matrix.
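A quick NumPy illustration of that orthogonality claim; the random data here is just an assumption for the demo:

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
C = np.cov(X, rowvar=False)              # 3x3 symmetric covariance matrix

eigvals, eigvecs = np.linalg.eigh(C)     # eigh is the routine for symmetric matrices
print(np.round(eigvecs.T @ eigvecs, 6))  # ~identity matrix: the columns
                                         # (eigenvectors) are orthonormal
```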
Sir, watching this was pure joy.
What can I even say now, sir.
This was the best video on PCA.
Love you, sir.
Wow, what content! You are playing a big role in making me a data scientist.. thank you, sir!
Hi Pravin, if you have got the job, could you guide me a little? I have questions about how the work gets distributed in the data science department of a company, how the department works, etc. Could you please share your email?
@akshaypatil8155 no 👎
Speechless ,too good to grasp
Extremely superb explanation, kudos.
god bless you. wonderful session
Thanks sir for this amazing explanation
Awesome video, omg, you explained every concept so clearly. Thanks a lot, sir.
Amazing clarity !!!
Thank You sir.
Such a crisp explanation....
Brilliant explanation! Thank you so much :)
very excellent explanation.....
Great step-by-step explanation
Thanks for the excellent video, bro. @16:21, in covariance we subtract the mean from the values and then multiply, right?
Great explanation @Nitish
Amazing Tutorial
Awesome 👏👍🏻
Thanks for explanations!
Excellent!!
you are a saviour to my sinking boat❣. thanks a lot.
was very useful for me, thank you :)
Excellent !!!
Pretty good explanation of doing PCA computationally without using sklearn.
Thank You Sir.
Just so awesome! Can't describe it!
Pure Gold
excellent explanation
Thankyou sir for this amazing video.
clearly explained
Amazing!
You are the best in the business.
The best explanation
Wow is my first expression after watching this video....
00:02 PCA aims to reduce dimensionality while maintaining data essence.
02:55 Projection and unit vector for PCA
10:27 Principal Component Analysis (PCA) helps to find the direction of maximum variance.
12:48 Variance measures the spread of data
19:22 Principal Component Analysis (PCA) helps in understanding the spread and orientation of data.
21:56 PCA provides complete information about data spread and orientation.
27:10 Principal Component Analysis involves transformations and changing directions of vectors.
29:39 A linear transformation does not change the direction of its eigenvectors.
34:24 Principal Component Analysis (PCA) uses eigenvectors for linear transformation.
36:36 Principal Component Analysis (PCA) helps identify vectors with the highest variation in data.
41:55 Principal Component Analysis allows transforming data and creating new dimensions.
44:15 PCA involves transforming the dataset to a new coordinate system
49:14 Using PCA to find the best two-dimensional representation of 3D data
52:07 Principal Component Analysis (PCA) involves transforming and projecting the data.
Crafted by Merlin AI.
Thank you sir
very very valuable.
This is even better than Josh Starmer's video.
I am going mad 😶; you are truly a legend 🔥
Best!
😭😭😭😭thanks a lot sir, thank you so much
Wow, kudos for the explanation.
Guru ji, where are your feet (so I may touch them)... God bless you, sir.
Request you to continue the deep learning series.
You are ML guru, 🙏
Brother, that's a top-notch explanation.
One small correction at 52:55:
the eigenvectors are the COLUMNS of the matrix returned by np.linalg.eig(), not the rows which you have used...
Please correct me if I am wrong (a quick check below).
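The commenter is right per the NumPy documentation: column i of the returned matrix pairs with eigenvalue i. A minimal verification, using an arbitrary small symmetric matrix:

```python
import numpy as np

C = np.array([[2.0, 1.0],
              [1.0, 2.0]])                # a small symmetric (covariance-like) matrix
eigvals, eigvecs = np.linalg.eig(C)

for i in range(len(eigvals)):
    col = eigvecs[:, i]                   # COLUMN i is the eigenvector for eigvals[i]
    print(np.allclose(C @ col, eigvals[i] * col))   # True, True
```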
Do a series on time series analysis and NLP please
Sir, I want to know how the libraries work internally; it would help if you could give a basic explanation of that. Otherwise, this series is very awesome.
love you sir ❣ crush updated
The largest eigenvector will correspond to the largest eigenvalue, but more than one eigenvector can correspond to a single eigenvalue; in fact there is a whole eigenspace (except the 0 vector, of course)!! In the R2 plane there are uncountably many eigenvectors corresponding to the largest eigenvalue.
Ooh, I figured it out. I think if the eigenvectors are linearly dependent they share the same direction, and direction is what matters; and if we have a linearly independent one, then we have one more u which works equally well.
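A small numerical illustration of that point; the matrix and vector here are chosen arbitrarily for the demo:

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])
v = np.array([0.0, 1.0])                 # eigenvector of A for eigenvalue 3

# Every nonzero scalar multiple of v is also an eigenvector for eigenvalue 3,
# so the "largest" eigenvector is really a whole direction (an eigenspace).
for k in [1.0, -2.0, 0.5]:
    w = k * v
    print(np.allclose(A @ w, 3 * w))     # True for every nonzero k
```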
Wow, sir, thanks, you are the best. But why haven't you made further videos on unsupervised learning? I am waiting.
Please reply.
One stop everything
best💗
One question: do we need to sort the eigenvectors by descending eigenvalue and then choose the eigenvectors accordingly?
Also, the sum of the top-K eigenvalues will show us how many eigenvectors we need to take (in the case of high-dimensional data).
I think there can only be n eigenvalues for an n×n matrix, and n unit eigenvectors for it, but there can be as many eigenvectors as you like: just multiply a unit eigenvector by some scalar k to get more eigenvectors. :)
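A sketch of the sorting and explained-variance idea from the question above; the random data, the 95% threshold, and the variable names are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(500, 5))
C = np.cov(X, rowvar=False)

eigvals, eigvecs = np.linalg.eigh(C)       # eigh returns eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]          # so re-sort them in DESCENDING order
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# Cumulative explained-variance ratio suggests how many components to keep
explained = np.cumsum(eigvals) / np.sum(eigvals)
k = int(np.searchsorted(explained, 0.95)) + 1   # e.g. retain 95% of the variance
print(k, np.round(explained, 3))
```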
best
17:22, why use the covariance matrix and not the correlation matrix?
👌👌🔥🔥🔥🔥🔥
Hi Sir, just to understand the concept well: when we do the transformation of the data D, do we use the matrix of eigenvectors (calculated from the covariance matrix) or the covariance matrix itself? It's the matrix of eigenvectors, right?
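For what it's worth, the projection step does use the eigenvector matrix. A minimal end-to-end sketch under assumed shapes (rows are samples here):

```python
import numpy as np

rng = np.random.default_rng(3)
D = rng.normal(size=(200, 3))            # data: one sample per ROW
Dc = D - D.mean(axis=0)                  # mean-center first

C = np.cov(Dc, rowvar=False)             # 3x3 covariance matrix
eigvals, W = np.linalg.eigh(C)           # W holds eigenvectors as COLUMNS
W = W[:, np.argsort(eigvals)[::-1]]      # order by decreasing eigenvalue

# The transformation multiplies the data by the EIGENVECTOR matrix,
# not by the covariance matrix itself:
D_new = Dc @ W[:, :2]                    # keep the top-2 principal components
print(D_new.shape)                       # (200, 2)
```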
@CampusX, Sir, please help with the t-SNE algorithm also.