Principal Component Analysis (PCA) | Part 2 | Problem Formulation and Step-by-Step Solution

  • Published 7 Jul 2024
  • This video breaks down the problem formulation and offers a step-by-step solution guide. Enhance your understanding of PCA and master the techniques for dimensionality reduction in your data.
    Code used: github.com/campusx-official/1...
    About Eigenvectors:
    www.visiondummy.com/2014/04/g....
    • Eigenvectors and eigen...
    Plotting tool used:
    www.geogebra.org/m/YCZa8TAH
    ============================
    Do you want to learn from me?
    Check my affordable mentorship program at: learnwith.campusx.in/s/store
    ============================
    📱 Grow with us:
    CampusX's LinkedIn: / campusx-official
    CampusX on Instagram for daily tips: / campusx.official
    My LinkedIn: / nitish-singh-03412789
    Discord: / discord
    E-mail us at support@campusx.in
    ⌚Time Stamps⌚
    00:00 - Practical Example on MNIST Dataset
    00:33 - Problem Formulation
    12:55 - Covariance and Covariance Matrix
    23:17 - Eigen Vectors and Eigen Values
    25:37 - Visualizing Linear Transformations
    35:35 - Eigendecomposition of a Covariance Matrix
    38:04 - How to solve PCA
    43:41 - How to transform points?
    48:18 - Code Demo with Visualization
    56:00 - Outro

COMMENTS • 143

  • @ADESHKUMAR-yz2el
    @ADESHKUMAR-yz2el 11 months ago +22

    The Guru is Brahma, the Guru is Vishnu, the Guru is the god Maheshwara;
    the Guru is verily the supreme Brahman: salutations to that Guru.
    You are the real teacher, sir.. 💫

  • @akash.deblanq
    @akash.deblanq 2 years ago +93

    This is hands down the best PCA explanation I have seen on the internet. Period!

  • @amartyatalukdar1024
    @amartyatalukdar1024 4 months ago +6

    I have studied eigenvalues and eigenvectors multiple times, but this video explained their depth to me in a very simple way! One of the best teachers out there.

  • @sudhanshusingh5594
    @sudhanshusingh5594 2 years ago +14

    You are the perfect example of feature extraction, sir:
    you convert high-dimensional knowledge into 2D and make it easy to understand.
    Thank you so much, sir.

  • @avinashpant9860
    @avinashpant9860 1 year ago +22

    I am always amazed at how important the concepts of eigenvectors and eigenvalues are; they are among the most important concepts in quantum mechanics. Every operator (e.g., energy, momentum) in quantum mechanics is a linear operator, and our aim is usually to find the corresponding eigenvectors and eigenvalues. The time-independent Schrödinger equation usually takes the form of an eigenvalue equation, Hψ = Eψ. It's so amazing to see how these concepts find their role in machine learning as well. My love for math keeps on growing. As always, thank you for your amazing videos.

  • @bibhutibaibhavbora8770
    @bibhutibaibhavbora8770 7 months ago +4

    OMG, in one video he explains the most difficult linear algebra topic, applies it to machine learning, and also shows us the code. Hats off.

    • @Lets_do_code-vl7im
      @Lets_do_code-vl7im 6 months ago

      It's a video in 3 parts, and he explains in such a way that even a layman can understand the concept.

  • @lightyagami7085
    @lightyagami7085 2 years ago +6

    No one has ever explained eigenvectors in such a simple way. You are awesome!!

  • @AltafAnsari-tf9nl
    @AltafAnsari-tf9nl 1 year ago +10

    You are simply amazing. I bet no one can teach PCA so well.

  • @alkalinebase
    @alkalinebase 1 year ago +2

    Honestly, everything I know I owe to you. Thank you for being a real hero in time of need!!!

  • @pulimiyashwanth9925
    @pulimiyashwanth9925 9 months ago +2

    This channel is so underrated; by watching this PCA video anyone can understand dimensionality reduction. Thank you, sir, for the hard work.

  • @11aniketkumar
    @11aniketkumar 8 months ago

    The Guru is Brahma, the Guru is Vishnu, the Guru is the god Maheshwara;
    the Guru is verily the supreme Brahman: salutations to that Guru.
    Countless salutations to you, sir, for sharing this knowledge with all of us through YouTube.

  • @samikshakolhe5086
    @samikshakolhe5086 1 year ago +2

    The most epic PCA explanation ever seen on YouTube, never yet done by any DS YouTuber. Hats off to your teaching skills, sir.

  • @singnsoul6443
    @singnsoul6443 7 months ago

    I am blown away by understanding the true meaning of eigenvectors. I always knew the definition, but now I have understood the meaning. You are a savior!

  • @akshaythakor5501
    @akshaythakor5501 6 months ago

    I was actually learning PCA for the first time. When I watched the video the first time I didn't understand it, but when I watched it a second time all the topics became very clear. This video is amazing.

  • @satyamgupta4808
    @satyamgupta4808 9 months ago

    Sir, no one on YouTube has taught this so intuitively. Even paid courses can't teach in this much depth.

  • @bhavikpunmiya9641
    @bhavikpunmiya9641 3 months ago

    Thank you so much, sir. You not only cleared my doubts about how PCA works, but for the first time gave me the mathematical intuition behind eigenvalues, eigenvectors, and matrix transformations, which I had been learning for so many years.
    Best explanation I've seen on this topic.

  • @varunahlawat9013
    @varunahlawat9013 1 year ago +2

    The most satisfying Machine Learning lecture that I've ever seen by far🤩🤩

  • @soumilyade1057
    @soumilyade1057 1 year ago +2

    45:44 I think it's going to be (3,1), and when transposed it's going to be (1,3), which is then multiplied with the matrix representing the dataset: (1,3) × (3,1000). This representation is valid too.
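
    A minimal numpy sketch of the shape bookkeeping described above, using random stand-in data and a made-up direction vector (only the shapes matter here):

        import numpy as np

        rng = np.random.default_rng(0)
        D = rng.standard_normal((3, 1000))   # dataset: 3 features x 1000 points

        u = np.array([[1.0], [2.0], [2.0]])  # hypothetical direction, shape (3, 1)
        u = u / np.linalg.norm(u)            # normalize to a unit vector

        coords = u.T @ D                     # (1, 3) @ (3, 1000) -> (1, 1000)
        print(coords.shape)                  # one projected coordinate per point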

  • @farhansarguroh8680
    @farhansarguroh8680 1 year ago +1

    My head hurts; this is so descriptive, apt, and worth all the time. The best. Kudos.

  • @krishnendubarman8490
    @krishnendubarman8490 1 year ago +5

    You are really a good teacher. I am at IIT Bombay, Environmental Engineering, M.Tech, but I wanted to learn ML, and this playlist is so far the most understandable for me.

  • @vashugarg2072
    @vashugarg2072 1 year ago +1

    The best teaching skills I have ever seen, for all machine learning concepts. Hats off to you, sir!🎉🎉🎊🎊

  • @aounhaider8335
    @aounhaider8335 11 months ago

    You have cleared up a concept for me that was not well explained by any other instructor on YouTube! Great job❤❤

  • @nishantgoyal6657
    @nishantgoyal6657 5 months ago

    One of the best videos I found for PCA. You have great skills brother.

  • @muhammadumair1280
    @muhammadumair1280 2 years ago +2

    Love from Karachi, Pakistan.

  • @tr-GoodVibes
    @tr-GoodVibes 1 year ago +1

    This is called Teaching ! Thanks for this wonderful explanation.

  • @bikimaharana9350
    @bikimaharana9350 2 years ago +1

    Loved it, brother; hardly anyone on YouTube has explained it this well.

  • @ali75988
    @ali75988 6 months ago +1

    One small explanation of the shortcut at 16:04:
    the actual covariance formula includes xmean and ymean; here both were zero, which is why the shortcut sum(x*y)/3 works.
    The formula for covariance is:
    covariance(x,y) = sum[(x - xmean)(y - ymean)] / n
    For the same reason, the covariance matrix has variances on its diagonal (22:57):
    both features are the same x,
    so covariance(x,x) = sum[(x - xmean)(x - xmean)] / n,
    which is exactly the formula for variance.
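
    A short numpy check of this point on a toy mean-centered sample (the numbers are made up):

        import numpy as np

        x = np.array([-1.0, 0.0, 1.0])   # toy feature, already mean-centered
        y = np.array([-2.0, 1.0, 1.0])   # toy feature, also mean-centered

        n = len(x)
        full = np.sum((x - x.mean()) * (y - y.mean())) / n  # full formula
        shortcut = np.sum(x * y) / n     # valid only because both means are zero
        print(full, shortcut)            # prints the same value twice

        # covariance of a feature with itself is its variance (the diagonal)
        print(np.sum(x * x) / n, x.var())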

  • @QAMARRAZA-pm6nc
    @QAMARRAZA-pm6nc 3 months ago

    How can I thank you? What a wonderful teacher, available for free to help so many students.

  • @ritugujela8345
    @ritugujela8345 1 year ago +1

    Thank you so much, sir. You always leave us awestruck with your remarkable explanations and in-depth knowledge. I never knew this topic could be explained with this much clarity.
    The teacher I never knew I needed in my life ❤️✨

  • @krithwal1997
    @krithwal1997 2 years ago +3

    This channel is a goldmine 🙌

  • @deepanshugoel3790
    @deepanshugoel3790 1 year ago

    You have made a complex topic like PCA so easy for us to understand.

  • @sudiptahalder423
    @sudiptahalder423 1 year ago

    Best explanation of PCA I've ever seen!! ❤️

  • @bangarrajumuppidu8354
    @bangarrajumuppidu8354 2 years ago +1

    What an amazing explanation, very intuitive. I am following your whole series, sir.

  • @anshuman4mrkl
    @anshuman4mrkl 3 years ago +1

    Amazingly explained. 🤩👏🏻

  • @monicakumar6769
    @monicakumar6769 8 months ago

    This is the best video I have watched on this topic!

  • @ssh0059
    @ssh0059 1 month ago

    Wow, the best video on PCA on the internet.

  • @nitinchityal583
    @nitinchityal583 1 year ago +1

    Speechless... you deserve a million subscribers at least.

    • @campusx-official
      @campusx-official 1 year ago

      Natural Language Processing(NLP): ua-cam.com/play/PLKnIA16_RmvZo7fp5kkIth6nRTeQQsjfX.html

  • @susamay
    @susamay 9 months ago

    You are the best, Nitish. Thanks for all of these.

  • @pravinshende.DataScientist
    @pravinshende.DataScientist 2 years ago +1

    Your content is the best ever!! Thank you, sir!

  • @osho_magic
    @osho_magic 1 year ago

    I have seen and understood the linear algebra playlist at 3Blue1Brown, but you clarified my doubts even further. Thanks.

  • @vijendravaishya3431
    @vijendravaishya3431 4 days ago +1

    Those wondering why there are three eigenvectors every time: the covariance matrix is a symmetric matrix, and real symmetric matrices have n linearly independent, orthogonal eigenvectors. The zero vector is not considered an eigenvector even though it satisfies Ax = λx; likewise, there may be up to n linearly independent eigenvectors for an n×n symmetric matrix.
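
    A quick numpy illustration of that property, on an arbitrary symmetric matrix built as A @ A.T (hypothetical data, just to show the orthogonality):

        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.standard_normal((3, 5))
        C = A @ A.T                      # symmetric, like a covariance matrix

        vals, vecs = np.linalg.eigh(C)   # eigh is meant for symmetric matrices

        # columns of vecs are orthogonal unit eigenvectors: V.T @ V = I
        print(np.allclose(vecs.T @ vecs, np.eye(3)))   # True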

  • @world4coding
    @world4coding 1 year ago

    Sir, watching this was pure joy.
    What can I even say now, sir?
    This was the best PCA video.
    Love you, sir.

  • @pravinshende.DataScientist
    @pravinshende.DataScientist 2 years ago +4

    Wow, what content! You are playing a big role in making me a data scientist. Thank you, sir!

    • @akshaypatil8155
      @akshaypatil8155 1 year ago

      Hi Pravin, if you have got the job, could you guide me a little? I have questions about how work is distributed in the data science department of a company, how the department works, etc. Could you please share your email?

    • @shubhamagrahari9745
      @shubhamagrahari9745 8 months ago

      @@akshaypatil8155 no 👎

  • @ashishshejwal8514
    @ashishshejwal8514 1 year ago

    Speechless, too good to grasp.

  • @sameerabanu3115
    @sameerabanu3115 8 months ago

    Extremely superb explanation, kudos.

  • @pramodshaw2997
    @pramodshaw2997 2 years ago +2

    god bless you. wonderful session

  • @namansethi1767
    @namansethi1767 2 years ago +1

    Thanks sir for this amazing explanation

  • @anoopkaur6119
    @anoopkaur6119 9 months ago

    Awesome video, OMG, you explained every concept so clearly. Thanks a lot, sir.

  • @hello-iq6yz
    @hello-iq6yz 1 year ago

    Amazing clarity !!!

  • @Rupesh_IITBombay
    @Rupesh_IITBombay 5 months ago

    Thank you, sir.
    Such a crisp explanation...

  • @somanshkumar1325
    @somanshkumar1325 1 year ago

    Brilliant explanation! Thank you so much :)

  • @brajesh2334
    @brajesh2334 3 years ago +1

    Truly excellent explanation...

  • @TheMLMine
    @TheMLMine 7 months ago

    Great step-by-step explanation

  • @rafibasha1840
    @rafibasha1840 2 years ago +2

    Thanks for the excellent video, bro. @16:21, in covariance we subtract the mean from the values and then multiply, right?

  • @gauravpundir97
    @gauravpundir97 1 year ago

    Great explanation @Nitish

  • @abhijitkumar7831
    @abhijitkumar7831 2 months ago

    Amazing Tutorial

  • @ajaykushwaha4233
    @ajaykushwaha4233 3 years ago +2

    Awesome 👏👍🏻

  • @harsh2014
    @harsh2014 1 year ago

    Thanks for the explanations!

  • @shaan200384
    @shaan200384 1 year ago

    Excellent!!

  • @aryastark4064
    @aryastark4064 8 months ago

    You are a saviour for my sinking boat ❣. Thanks a lot.

  • @lijindurairaj2982
    @lijindurairaj2982 2 years ago +1

    was very useful for me, thank you :)

  • @pavangoyal6840
    @pavangoyal6840 1 year ago

    Excellent !!!

  • @amirman6
    @amirman6 1 year ago

    Pretty good explanation of doing PCA computationally without using sklearn.

  • @ParthivShah
    @ParthivShah 4 months ago +1

    Thank You Sir.

  • @sarumangla6030
    @sarumangla6030 1 year ago

    Just so awesome! Can't describe it!

  • @DimLightPoetries
    @DimLightPoetries 1 year ago

    Pure Gold

  • @gauravagrawal8078
    @gauravagrawal8078 6 months ago

    excellent explanation

  • @RitikaSharma-pt9ox
    @RitikaSharma-pt9ox 1 year ago

    Thank you, sir, for this amazing video.

  • @VIP-ol6so
    @VIP-ol6so 3 months ago

    clearly explained

  • @lakshityagi684
    @lakshityagi684 1 year ago

    Amazing!

  • @sushantsingh1133
    @sushantsingh1133 1 month ago

    You are the best in the business.

  • @roboioters
    @roboioters 1 year ago

    The best explanation

  • @pravinshende.DataScientist
    @pravinshende.DataScientist 2 years ago +1

    "Wow" was my first expression after watching this video...

  • @sahilkirti1234
    @sahilkirti1234 3 months ago +1

    00:02 PCA aims to reduce dimensionality while maintaining data essence.
    02:55 Projection and unit vector for PCA
    10:27 Principal Component Analysis (PCA) helps find the direction of maximum variance.
    12:48 Variance measures the spread of data
    19:22 Principal Component Analysis (PCA) helps in understanding the spread and orientation of data.
    21:56 PCA provides complete information about data spread and orientation.
    27:10 Principal Component Analysis involves transformations that change the directions of vectors.
    29:39 A linear transformation does not change the direction of its eigenvectors.
    34:24 Principal Component Analysis (PCA) uses eigenvectors for linear transformation.
    36:36 Principal Component Analysis (PCA) helps identify vectors with the highest variation in data.
    41:55 Principal Component Analysis allows transforming data and creating new dimensions.
    44:15 PCA involves transforming the dataset to a new coordinate system
    49:14 Using PCA to find the best two-dimensional representation of 3D data
    52:07 Principal Component Analysis (PCA) involves transforming and projecting the data.
    Crafted by Merlin AI.

  • @heetbhatt4511
    @heetbhatt4511 9 months ago

    Thank you sir

  • @core4032
    @core4032 2 years ago

    very very valuable.

  • @adityabhatt04
    @adityabhatt04 2 years ago +1

    This is even better than Josh Starmer's video.

  • @jiteshsingh6030
    @jiteshsingh6030 2 years ago +1

    I am going mad 😶; you are truly a legend 🔥

  • @balrajprajesh6473
    @balrajprajesh6473 1 year ago +1

    Best!

  • @AbcdAbcd-ol5hn
    @AbcdAbcd-ol5hn 1 year ago

    😭😭😭😭thanks a lot sir, thank you so much

  • @rashmiranjannayak8965
    @rashmiranjannayak8965 9 months ago

    Wow, kudos for the explanation.

  • @descendantsoftheheroes_660
    @descendantsoftheheroes_660 11 months ago

    Guru ji, I bow at your feet... God bless you, sir.

  • @pavangoyal6840
    @pavangoyal6840 1 year ago +1

    Requesting you to continue the deep learning series.

  • @pradeepmarpatla5498
    @pradeepmarpatla5498 7 months ago

    You are an ML guru 🙏

  • @kosttavmalhotra5899
    @kosttavmalhotra5899 9 months ago

    Brother, what an epic explanation.

  • @morancium
    @morancium 1 year ago +1

    One small correction at 52:55:
    the eigenvectors are the COLUMNS of the matrix returned by np.linalg.eig(), not the rows, which you have used...
    Please correct me if I am wrong.
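
    The commenter is right: numpy documents that np.linalg.eig returns the eigenvectors as the columns of its second output. A quick check on a small matrix (deliberately non-symmetric so that rows and columns differ clearly):

        import numpy as np

        M = np.array([[2.0, 1.0],
                      [0.0, 3.0]])             # deliberately non-symmetric
        vals, vecs = np.linalg.eig(M)

        col0 = vecs[:, 0]                      # a COLUMN of the output...
        print(np.allclose(M @ col0, vals[0] * col0))  # True: satisfies Mv = λv

        row0 = vecs[0, :]                      # ...whereas a row, in general, is not
        print(np.allclose(M @ row0, vals[0] * row0))  # False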

  • @nitinchityal583
    @nitinchityal583 1 year ago +1

    Do a series on time series analysis and NLP please

  • @core4032
    @core4032 2 years ago

    Sir, I want to know how the libraries work under the hood; it would help if you gave a basic explanation of that. Apart from that, this series is very awesome.

  • @abrarvlogs6931
    @abrarvlogs6931 9 months ago

    love you sir ❣ crush updated

  • @kindaeasy9797
    @kindaeasy9797 6 months ago

    The largest eigenvector will correspond to the largest eigenvalue, but more than one eigenvector can correspond to a single eigenvalue; in fact, there is a whole eigenspace (except the 0 vector, of course)!! In the R2 plane, there are uncountably many eigenvectors corresponding to the largest eigenvalue.

    • @kindaeasy9797
      @kindaeasy9797 6 months ago

      Ooh, I figured it out. I think if the eigenvectors are linearly dependent they have the same direction, and direction is what matters; and if we have a linearly independent one, then we have one more direction that works equally well.
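
      A tiny numpy sketch of this thread's point: every nonzero scalar multiple of an eigenvector is again an eigenvector for the same eigenvalue, so PCA really selects a direction (an eigenspace), not one particular vector:

          import numpy as np

          C = np.array([[2.0, 1.0],
                        [1.0, 2.0]])          # toy symmetric covariance matrix
          vals, vecs = np.linalg.eig(C)

          v = vecs[:, np.argmax(vals)]        # unit eigenvector, largest eigenvalue
          for k in (2.0, -5.0, 0.1):          # arbitrary nonzero rescalings
              print(np.allclose(C @ (k * v), vals.max() * (k * v)))  # True each time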

  • @islamicinsights6342
    @islamicinsights6342 2 years ago +1

    Wow, sir, thanks, you are the best. But why haven't you made further videos on unsupervised learning? I am waiting for them.
    Please reply.

  • @Hellow_._
    @Hellow_._ 1 year ago

    One stop everything

  • @yashjain6372
    @yashjain6372 1 year ago

    best💗

  • @pramodshaw2997
    @pramodshaw2997 2 years ago +1

    One question: do we need to sort the eigenvectors by largest eigenvalue and then choose the eigenvectors accordingly?
    Also, the sum of the top k eigenvalues will show us how many eigenvectors we need to take (in the case of high-dimensional data).
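
    Yes on both counts; that is the standard recipe. A minimal sketch under illustrative assumptions (random stand-in data with samples in rows, and a hypothetical 95% variance threshold):

        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.standard_normal((200, 5))  # stand-in data: 200 samples, 5 features
        X = X - X.mean(axis=0)             # center each feature

        C = np.cov(X, rowvar=False)        # 5x5 covariance matrix
        vals, vecs = np.linalg.eigh(C)     # eigh returns ascending eigenvalues

        order = np.argsort(vals)[::-1]     # sort descending by eigenvalue
        vals, vecs = vals[order], vecs[:, order]

        explained = vals / vals.sum()      # fraction of variance per component
        k = np.searchsorted(np.cumsum(explained), 0.95) + 1  # components for ~95%
        X_reduced = X @ vecs[:, :k]        # project onto the top-k eigenvectors
        print(k, X_reduced.shape)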

  • @BAMEADManiyar
    @BAMEADManiyar 8 months ago +1

    I think there can only be n eigenvalues for an n×n matrix, and n unit eigenvectors for it, but there can be arbitrarily many eigenvectors: just multiply those unit eigenvectors by some scalar k to get more of them. :)

  • @yashjain6372
    @yashjain6372 1 year ago +1

    best

  • @rafibasha4145
    @rafibasha4145 2 years ago +2

    17:22, why use the covariance matrix and not correlation?

  • @itz_me_imraan02
    @itz_me_imraan02 7 months ago

    👌👌🔥🔥🔥🔥🔥

  • @josebgeorge227
    @josebgeorge227 3 months ago

    Hi Sir, just to understand the concept well: when we transform the data D, do we use the matrix of eigenvectors (calculated from the covariance matrix) or the covariance matrix itself? It's the matrix of eigenvectors, right?

  • @IRFANSAMS
    @IRFANSAMS 2 years ago +1

    @CampusX, sir, please help with the t-SNE algorithm also.