PCA Indepth Geometric And Mathematical InDepth Intuition ML Algorithms

  • Published 28 Nov 2024

COMMENTS • 90

  • @exploreEverything4519
    @exploreEverything4519 1 year ago +19

    First I understood the PCA concept 3 years back from an NPTEL lecture. It was full of mathematics and went far above my head because the intuitive explanation was missing. Believe me, with your explanations I can understand his lecture too. No one could explain it the way you have. It was outstanding.

  • @IshanGarg-y1u
    @IshanGarg-y1u 1 year ago +13

    This is a good video. I recommend you first watch the PCA step-by-step guide from StatQuest to get a high-level view with animations, then watch this video for more detail and understanding alongside some code. Then, if you want to know the mathematics behind it, refer to some articles online where they explain why we calculate the covariance matrix, build the objective function using a Lagrange multiplier, and then derive why the eigenvalues of the covariance matrix are the desired results.
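The math pipeline this comment describes (centre the data, compute the covariance matrix, eigen-decompose it, project onto the top eigenvectors) can be sketched from scratch in a few lines of NumPy. This is an illustrative sketch, not code from the video; the toy data and variable names are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))          # toy data: 100 samples, 3 features

Xc = X - X.mean(axis=0)                # 1. centre each feature
C = np.cov(Xc, rowvar=False)           # 2. 3x3 covariance matrix

# 3. eigen-decomposition (eigh, since C is symmetric); the eigenvectors
#    of the covariance matrix are the principal components
eigvals, eigvecs = np.linalg.eigh(C)

order = np.argsort(eigvals)[::-1]      # 4. sort by explained variance, descending
components = eigvecs[:, order]

X_reduced = Xc @ components[:, :2]     # 5. project 3D data onto the top 2 PCs
print(X_reduced.shape)                 # (100, 2)
```

The projection in step 5 is where the dimensionality reduction actually happens: all 3 eigenvectors exist, but only the 2 with the largest eigenvalues are kept.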

  • @aditinautiyal4299
    @aditinautiyal4299 1 year ago +6

    Thank you so much for not only sharing your knowledge but also putting in so much effort to cover each and every point of the topic.

  • @aj_actuarial_ca
    @aj_actuarial_ca 9 months ago +1

    PCA is so well explained in your video, sir. You're really the best teacher ever!!!

  • @pritamrajbhar9504
    @pritamrajbhar9504 6 months ago +4

    Thanks a lot, Krish. This is the simplest and most detailed video about PCA.

  • @man9mj
    @man9mj 8 months ago +1

    Thank you for this elegant effort in explaining PCA.

  • @akashpaul9892
    @akashpaul9892 1 year ago +2

    You really are a good teacher, brother... Teaching with relatable examples helps us understand each topic perfectly and easily. Thank you so much, brother. Keep teaching us...
    Love from Bangladesh

  • @syco-brain8543
    @syco-brain8543 2 months ago +1

    Best video about PCA on the internet so far.

  • @Harsh_Yadav_IITKGP
    @Harsh_Yadav_IITKGP 1 year ago +1

    Krish, your efforts in this ML series are remarkable...

  • @taslima5007
    @taslima5007 8 months ago

    You are my favourite YouTuber and teacher.

  • @dipamsarkar6626
    @dipamsarkar6626 1 year ago +1

    This guy should be called the "Godfather of Data Science in India". An absolute legend.

  • @ashwintiwari9642
    @ashwintiwari9642 1 year ago

    Nowhere else can I find an explanation like this. It's too good: no confusion, no complex demonstrations, clear use cases, the cleanest and simplest way to understand PCA in depth. Thanks a lot, Krish; it takes a lot of takes and research to explain a single topic in data science this way, and the work is all appreciated.

  • @vinothkumar7531
    @vinothkumar7531 10 months ago

    You are the greatest teacher I have ever seen in my entire life. The way you teach turns even a lazy or slow learner into a strong learner using the Krish Naik g(ji) Boosting algorithm. Just kidding 😃😃. Hats off to your effort to help people.

  • @samareshms4591
    @samareshms4591 7 months ago +3

    This guy is single-handedly carrying the AI/ML community in India 🙇‍♂🙇‍♂

  • @adnanshujah6230
    @adnanshujah6230 7 months ago

    Best of the best lecture. It covers all the required concepts on the subject. Most available videos only show how to perform PCA, not why it is required or the concept behind it. Sir Krish, thank you so much for such a detailed lecture and for clearing up the concepts. A highly recommended lecture and channel.
    🥰🥰🥰🥰🥰🥰

    • @adnanshujah6230
      @adnanshujah6230 7 months ago

      I'll simply say this one video is enough to get a clear concept. Once again, thank you so, so much, sir Krish.

  • @yogendrapratap1982
    @yogendrapratap1982 1 year ago +15

    Everything in this lecture series has been really resourceful, but this lecture was overly extended: a 30-minute topic has been stretched to 1 hour 30 minutes, repeating the same stuff again and again.

  • @SanthoshKumar-dk8vs
    @SanthoshKumar-dk8vs 1 year ago +3

    Thanks for sharing, Krish, really helpful. For the last two days I have been refreshing this topic only 🤗

  • @amitx26
    @amitx26 8 months ago

    Sir, one thing I have felt strongly is that you explain and deliver a little better in recorded videos. Thanks for providing such great content for us for free!

    • @RakshithML-vo1tr
      @RakshithML-vo1tr 7 months ago

      Hi bro, I am starting data science. How should I start? By following Krish sir's roadmap? And, like you said, should I prefer the recorded videos?

  • @paneercheeseparatha
    @paneercheeseparatha 1 year ago +4

    A wonderful attempt to explain PCA without much mathematics. Though it would be great if you also did a video on implementing PCA from scratch in Python. Loved your playlist! Kudos to you!

  • @pankajray5939
    @pankajray5939 1 year ago +1

    PCA is one of the important topics in ML.

  • @viratkumar9161
    @viratkumar9161 1 year ago +2

    It's quite vague to say that if the Pearson correlation value is zero there is no relationship between x and y. For example, consider the line Y = mod(X): the Pearson correlation is 0, but there is still a relationship, easily visible after plotting.

    • @SiddharthSwamynathan
      @SiddharthSwamynathan 1 year ago

      Correct. Pearson correlation only has the capacity to capture a linear relationship. A coefficient of 0 means no linear relationship exists, but a non-linear relationship may still exist between the covariates and the target.
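The Y = mod(X) counterexample raised above is easy to verify numerically; a small sketch with illustrative values:

```python
import numpy as np

x = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.abs(x)                      # an obvious relationship, but not a linear one

r = np.corrcoef(x, y)[0, 1]        # Pearson correlation coefficient
print(r)                           # 0.0 — Pearson sees no *linear* relation here
```

The symmetric V shape makes the positive and negative contributions to the covariance cancel exactly, even though y is fully determined by x.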

  • @ramakrishnayellela7455
    @ramakrishnayellela7455 7 months ago

    Such a good explanation, Krish.

  • @kvafsu225
    @kvafsu225 1 year ago

    Excellent presentation.

  • @baravind6548
    @baravind6548 6 months ago +2

    When extracting from 2D to 1D, if PC1 has the highest variance and PC2 the second highest variance, is it necessary that PC1 be perpendicular to PC2?

  • @IzuchukwuOkafor-v6e
    @IzuchukwuOkafor-v6e 8 months ago

    Very lucid explanation of PCA.

  • @irisshutterwork1411
    @irisshutterwork1411 1 year ago +1

    Well explained. Thank you

  • @mr.pianist
    @mr.pianist 3 months ago

    Very good lecture, beginner-friendly.

  • @thop9747
    @thop9747 1 year ago

    Was really helpful. Keep up the work, sir.

  • @shivachauhan2837
    @shivachauhan2837 1 year ago +2

    To improve my resume, what should I try: Kaggle or open source?

  • @unicornsolutiongh2022
    @unicornsolutiongh2022 1 year ago

    Powerful lecture. Keep it up, sir.

  • @AjayPatel-pc1yf
    @AjayPatel-pc1yf 1 year ago

    Amazing, sir, really enjoyed it ❤

  • @harshitsamdhani1708
    @harshitsamdhani1708 11 months ago

    Thank You for the video

  • @manikandanm3277
    @manikandanm3277 1 year ago +3

    In the theory part, to find the eigenvalues, you multiply the covariance matrix with a vector. How is that particular vector v chosen and used to multiply with the covariance matrix? I'm confused by this only; otherwise a great lecture, thanks Krish 👍

    • @priyam39
      @priyam39 1 year ago

      That v is the eigenvector itself that we are looking for. Sir just explained it.
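As the reply says, v is not picked beforehand: it is the unknown in Cv = λv that the eigen-solver finds. A minimal NumPy check (the matrix values below are made up for illustration):

```python
import numpy as np

# A small symmetric matrix standing in for a 2x2 covariance matrix
C = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# v is not chosen by hand: np.linalg.eigh solves C v = lambda v for us
eigvals, eigvecs = np.linalg.eigh(C)
v = eigvecs[:, -1]                  # eigenvector of the largest eigenvalue
lam = eigvals[-1]

# Multiplying C by its own eigenvector only rescales it by lambda
print(np.allclose(C @ v, lam * v))  # True
```

Any other vector would change direction when multiplied by C; only the eigenvectors are merely stretched, which is exactly the property PCA exploits.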

  • @PAVVamshhiKrishna
    @PAVVamshhiKrishna 3 months ago

    Fantastic

  • @yachitmahajan3579
    @yachitmahajan3579 7 months ago

    Best explanation.

  • @bhagyashriakolkar7763
    @bhagyashriakolkar7763 1 year ago

    Thank you, sir... nice explanation.

  • @sumankumar01
    @sumankumar01 1 year ago +1

    Do CampusX and you both refer to the same books, since the example is the same?

  • @chayanikaboruha6657
    @chayanikaboruha6657 8 months ago

    Krish, please make a video on how we can use an autoencoder for text data.

  • @the-ghost-in-the-machine1108

    Thanks, sir, God bless you!

  • @javeedtech
    @javeedtech 1 year ago

    Thanks for the video, from FSDS batch 2.

  • @jitendrasahay3847
    @jitendrasahay3847 1 month ago

    If we have 3 features then we get 3 eigenvectors, and later we combine 2 of them to create 1 eigenvector. Combining here basically means projection. Earlier, when we projected, we got n eigenvectors out of n features; then again we will get 2 eigenvectors. Where is the dimensionality reduction happening???
    What am I really missing here???
    Can anyone help???

  • @Nikhillllllllllllll
    @Nikhillllllllllllll 11 months ago +1

    How do we get the names of those 2 features we got after feature extraction?

  • @BMVLM-
    @BMVLM- 12 days ago

    Brother, the content is great, but there are so many advertisements that it's really disturbing.

  • @lagangupta3193
    @lagangupta3193 3 months ago

    How will we decide the number of features that we have to specify in n_components?
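One common answer to this question, assuming the scikit-learn PCA API used in videos like this one, is to inspect the cumulative explained variance ratio. The 95% threshold below is a rule of thumb, not something prescribed by the video:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X = load_iris().data                   # toy dataset with 4 features

# Fit with all components first, then inspect how much variance each explains
pca = PCA().fit(X)
cum = np.cumsum(pca.explained_variance_ratio_)
print(cum)

# Rule of thumb: keep just enough components to explain ~95% of the variance
n_components = int(np.searchsorted(cum, 0.95) + 1)
print(n_components)
```

scikit-learn can also do this selection internally: passing a float such as `PCA(n_components=0.95)` keeps just enough components to reach that fraction of the variance.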

  • @CodeWonders_
    @CodeWonders_ 1 year ago +1

    Can you tell me who will teach the data science course, you or Sudhanshu sir?

  • @eurekad7340
    @eurekad7340 2 months ago +1

    If possible, could you please make a video on truncated SVD as well? I searched but couldn't find any video from you on SVD.

  • @user-rx5kq6oo9y
    @user-rx5kq6oo9y 1 year ago +4

    Bro, can you make data science cheat sheets, like the multiple DSA sheets on YouTube?

  • @RahulA-b9o
    @RahulA-b9o 1 year ago

    How do I know that the model is overfitted? Is there any method to find out whether the trained model suffers from the curse of dimensionality?

  • @mohitkumarsingh7318
    @mohitkumarsingh7318 8 months ago +1

    Sir, please also cover SVD; it's a request.

  • @BharatDhungana-n4s
    @BharatDhungana-n4s 9 months ago

    The implementation part is the best.

  • @kunalpandya8468
    @kunalpandya8468 1 year ago

    After we get 2 features from PCA, what are the names of those two features?

  • @samthomas3881
    @samthomas3881 9 months ago

    Thanks Sir!

  • @arungireesh686
    @arungireesh686 1 year ago

    Superb.

  • @baravind6548
    @baravind6548 6 months ago

    How do we get the vector v that is to be multiplied by A?

  • @muhammadrafiq1720
    @muhammadrafiq1720 1 year ago

    There is an ad after every 3 to 4 minutes; it's difficult to concentrate, especially on a slow internet connection.

  • @viratjanghu945
    @viratjanghu945 1 year ago

    Sir, please make a video on independent component analysis and linear discriminant analysis; it is my humble request.

  • @somnath1235
    @somnath1235 1 year ago +1

    What do the covariance and correlation decide? Does covariance denote how closely 2 features are related? And does correlation denote whether the features are directly or inversely proportional?

    • @saisrinivas3066
      @saisrinivas3066 1 year ago

      Covariance only describes the direction of the relationship, whereas correlation describes both the direction and the strength of the relationship between two numerical variables.

    • @bhargav1811
      @bhargav1811 1 year ago

      Correlation is a scaled version of covariance!
      Range of covariance: (-inf, +inf)
      Range of correlation: [-1, +1]

    • @Datadynamo
      @Datadynamo 1 year ago +3

      Covariance is a measure of the joint variability of two random variables. It tells you how two variables are related to each other. A positive covariance means the variables are positively related: as one increases, the other also tends to increase. A negative covariance means the variables are inversely related: as one increases, the other tends to decrease.
      Correlation is a normalized version of covariance; it measures the strength of the linear relationship between two variables. It ranges from -1 to 1, where -1 is perfect negative correlation, 0 is no linear correlation, and 1 is perfect positive correlation. Like covariance, it tells you how two variables are related, but it gives a more intuitive sense of the strength of the relationship, since it is scaled between -1 and 1.
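The points made in these replies are easy to demonstrate numerically; a short sketch with synthetic data (the numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(42)
x = rng.normal(size=200)
y = 3.0 * x + rng.normal(scale=0.5, size=200)   # strong positive linear relation

cov_xy = np.cov(x, y)[0, 1]            # unbounded: depends on the units of x and y
corr_xy = np.corrcoef(x, y)[0, 1]      # scaled to lie between -1 and 1
print(cov_xy, corr_xy)

# Rescaling a variable changes the covariance but not the correlation
cov_scaled = np.cov(x, 100 * y)[0, 1]
corr_scaled = np.corrcoef(x, 100 * y)[0, 1]
print(cov_scaled / cov_xy)                  # ~100
print(np.isclose(corr_xy, corr_scaled))     # True
```

The last two lines show why correlation is the more interpretable quantity: it is invariant to the units of measurement, while covariance scales with them.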

  • @mr.patientwolfx5984
    @mr.patientwolfx5984 1 year ago

    Sir, what do you think of the GUVI data science program? Can I join it?

  • @Bitter_Truth-zc4eq
    @Bitter_Truth-zc4eq 10 months ago

    Which software are you using for writing?

    • @KRSandeep
      @KRSandeep 7 months ago

      Scrble Ink, which is available for Windows laptops only.

  • @MamunKhan-px2vb
    @MamunKhan-px2vb 1 year ago

    Just Great

  • @ramdharavath7542
    @ramdharavath7542 1 year ago

    Useful

  • @ITSimplifiedinHINDI
    @ITSimplifiedinHINDI 4 months ago

    Guruji, why are you writing greater-than as less-than and less-than as greater-than?

  • @shanthan9.
    @shanthan9. 9 months ago

    Good video, but too lengthy.

  • @siddharthmohapatra7297
    @siddharthmohapatra7297 1 year ago

    Sir, I want to ask: I have no coding skills and come from a B.Com background.
    Can I do the Data Science Masters from PW Skills? Will everything be taught from the very basics?

    • @rutvikchauhan1572
      @rutvikchauhan1572 1 year ago +1

      You can do it. First learn Python, then search for data science courses on YouTube and on various platforms like Udemy, Coursera, SWAYAM... and enrol in one.

    • @siddharthmohapatra7297
      @siddharthmohapatra7297 1 year ago

      @@rutvikchauhan1572 I have enrolled in PW Skills.

    • @anuraganand6675
      @anuraganand6675 1 year ago

      @Rutvik Chauhan what is your feedback on the PW Skills data science course?

    • @akindia8519
      @akindia8519 5 months ago

      @@siddharthmohapatra7297 Hi, can you please give us feedback on PW Skills' Data Science Masters program?

  • @theharvi_
    @theharvi_ 7 months ago

    ❤thx

  • @SohanDeshar-pf6zh
    @SohanDeshar-pf6zh 5 months ago

    Good explanation, but it might be a good idea to remove one of the "InDepth"s from the video title.

  • @vaibhavyadav-w8g
    @vaibhavyadav-w8g 1 year ago

  • @AmmarAnjum-h2s
    @AmmarAnjum-h2s 10 months ago +2

    Sir, why don't you speak point to point? You repeat everything again and still miss some things.

  • @siddhisg
    @siddhisg 11 months ago +1

    The greater-than / less-than symbols though 🥲

  • @shruti9731
    @shruti9731 8 months ago

    ❤❤

  • @jitendrasahay3847
    @jitendrasahay3847 1 month ago

    I have to say: very short, precise material has been elongated irritatingly.
    Repetitive statements...

  • @satyapujari7731
    @satyapujari7731 1 year ago

    After every five minutes there was an advertisement, which made it difficult to concentrate while watching the video.

  • @vishalgupta9620
    @vishalgupta9620 1 year ago +1

    noob knows nothing

  • @priyotoshsahaThePowerOf23
    @priyotoshsahaThePowerOf23 1 year ago

    BEST