Eigendecomposition: Data Science Basics

  • Published Jan 12, 2025

COMMENTS • 143

  • @zigzagarmchair2367
    @zigzagarmchair2367 10 months ago +15

    omg, you are really great at explaining things using only a pen and a whiteboard, without the need for fancy digital animation. This is definitely what I call REAL "education"!!!

    • @nizogos
      @nizogos 17 days ago

      Does fancy digital animation make education worse for you? It offers the insights of experts in those subjects directly to you, without your having to study a subject for 20 years to understand it in depth.

  • @Vinladar
    @Vinladar 4 years ago +45

    This is definitely a great explanation of eigendecomposition.
    I kind of got into this rabbit hole trying to understand singular value decomposition, and this video helped me understand that as well.
    Thanks for helping me understand this.

  • @lenaso4555
    @lenaso4555 3 years ago +2

    Holy shit, you're literally blowing my mind (in a positive way) with your videos. I've never understood eigendecomposition (or many of the other topics you're explaining), but now it all makes sense. Please never stop with your videos!

  • @verule3928
    @verule3928 27 days ago

    Thank you so much for explaining this so clearly. I was struggling to understand this for so long, and you just made it so much easier. You are an excellent teacher!!

  • @dalisabe62
    @dalisabe62 1 month ago

    And it is affordable. Thirty lectures like this could be an entire course in linear algebra, priced at a fraction of what universities charge in tuition, and would save money, time, classroom space, and energy on the campus commute. However, we can only go so far doing matrices by hand, so the course would need a software package like Mathematica, Matlab or Maple to crunch the numbers. Thanks a great deal for the quality presentation.

  • @martinpign868
    @martinpign868 4 years ago +5

    Finally, someone who shows it simply and clearly and answers the most important question: why? Thank you!

  • @tojewel
    @tojewel 4 years ago +2

    Wish I could give more than one like. This channel is so underrated.

  • @andreipitkevich1088
    @andreipitkevich1088 1 year ago +4

    Surprisingly good explanation. Thanks a lot! I especially liked that all the information is presented in order, without gaps, and that an example of a practical application is given.

  • @zz-9463
    @zz-9463 4 years ago +4

    Never seen such a clear explanation! Thank you so much!

  • @dimidoff5431
    @dimidoff5431 4 years ago +7

    This is a great explanation. I'd been stuck trying to understand PCA and this really helps.

  • @amisha065
    @amisha065 9 months ago

    I'm just learning these basics and your videos are very comprehensive and highly informative. Looking forward to completing all the videos in the playlist!!

  • @365HockeyGirl
    @365HockeyGirl 4 years ago

    Watched this video as a refresher for my ML class and it was super helpful. Thanks!!!

  • @derrickagyemang1259
    @derrickagyemang1259 2 months ago

    Great video, love the clarity of the explanation

  • @sainandankandikattu9077
    @sainandankandikattu9077 4 years ago +1

    Honestly... you deserve at least a million subscribers... A moron professor in our Econometrics class didn't even try to do this in his class! Thanks, professor Ritvik!

    • @saraaltamirano
      @saraaltamirano 4 years ago +2

      While Ritvik is indeed A-MA-ZING, perhaps you should be a bit nicer to your econometrics professor :-)

  • @蔡小宣-l8e
    @蔡小宣-l8e 2 years ago

    Brief and clear! Thank you.

  • @RiteshSingh-ru1sk
    @RiteshSingh-ru1sk 3 years ago

    Wow, this is the best video on eigendecomposition. Thanks a lot, man!

  • @ImolaS3
    @ImolaS3 3 years ago

    A superb explanation that I got the first time through. Liked and subscribed!

  • @tarunbhatia8652
    @tarunbhatia8652 3 years ago +1

    Best video on eigenvalue decomposition on any platform. Thanks, man!

  • @olz6928
    @olz6928 17 days ago

    Hey! This video is great and it has helped me a lot. As feedback: when the video began, everything was already on the whiteboard, which felt really overwhelming to me. That might be something you want to think about in the future.

  • @yanwang248
    @yanwang248 4 years ago

    This channel is extremely useful, thank you very much

  • @himanshu1179
    @himanshu1179 1 month ago

    Beautifully explained Ritvik. 👍

  • @souravdey1227
    @souravdey1227 3 years ago +2

    Such a succinct explanation... Can you just explain why we normalised the eigenvectors?

  • @luca7x689
    @luca7x689 3 years ago

    Thank you so much. I always love to learn why things are important. Makes studying much more interesting :)

  • @usama57926
    @usama57926 3 years ago +1

    Beautiful explanation... Thanks!

  • @amritpalsingh6440
    @amritpalsingh6440 3 years ago +1

    Best help I found online. Thanks :)

  • @JosephRivera517
    @JosephRivera517 4 years ago

    This gives a lot of information about the process of doing it and its value in data science. Thanks.

  • @kally3432
    @kally3432 3 years ago +1

    I really love your explanations, really helpful

  • @jambulingamlogababu8914
    @jambulingamlogababu8914 1 year ago

    Thank you very much for your detailed explanation, with appropriate examples and their benefits.

  • @sanjeetwalia5077
    @sanjeetwalia5077 4 years ago

    I liked the video, very explanatory and understandable

  • @madhamj
    @madhamj 3 years ago +1

    Love it, bro! This explanation was so clear.

  • @abrhk96
    @abrhk96 4 years ago +1

    You made it so easy to understand! Thank you!

  • @robertovolpi
    @robertovolpi 9 months ago

    Outstanding explanation!
    It is very difficult to find that subject in a linear algebra college textbook.

  • @yingma6770
    @yingma6770 3 years ago +3

    Great explanation! Can you please give an example from machine learning or data science where we need to apply the same linear transformation again and again?
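
    One hedged example of such a case, my own rather than anything from the video: a Markov-chain / PageRank-style iteration applies the same transition matrix again and again, and the eigendecomposition tells you what the iteration converges to. A minimal numpy sketch with a hypothetical 2-state transition matrix P:

        import numpy as np

        P = np.array([[0.9, 0.5],
                      [0.1, 0.5]])       # hypothetical column-stochastic transition matrix
        x = np.array([1.0, 0.0])         # initial state distribution

        for _ in range(100):             # the same linear transformation, over and over
            x = P @ x

        vals, vecs = np.linalg.eig(P)
        top = vecs[:, np.argmax(vals)]   # eigenvector of the dominant eigenvalue (= 1)
        assert np.allclose(x, top / top.sum())   # the iteration converged to it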

  • @langwen8685
    @langwen8685 4 years ago

    Amazing clear explanation! Love u dude! Thx a million!

  • @rahulvansh2390
    @rahulvansh2390 2 years ago +2

    Only one doubt: what's the reason behind normalizing eigenvectors?
    Btw, your content and the way you explain these scary concepts taught me something that even MIT lectures couldn't. Thank you so much, sir; please keep making such videos!
    More power to you, sir :)

    • @TheRohit901
      @TheRohit901 2 years ago +2

      Because any scalar multiple of an eigenvector is still an eigenvector, we generally take the unit vector.
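
      A minimal numpy sketch of this point, using a hypothetical matrix rather than the one from the video: any nonzero scalar multiple of an eigenvector satisfies the same equation, so the unit-norm vector is just a convenient canonical choice.

          import numpy as np

          A = np.array([[2.0, 1.0],
                        [1.0, 2.0]])                 # hypothetical symmetric matrix
          vals, vecs = np.linalg.eig(A)
          u = vecs[:, 0]                             # an eigenvector for vals[0]

          for c in [1.0, -3.0, 0.5]:                 # any nonzero scalar multiple...
              v = c * u
              assert np.allclose(A @ v, vals[0] * v)   # ...is still an eigenvector

          u_hat = u / np.linalg.norm(u)              # normalization: divide by the length
          assert np.isclose(np.linalg.norm(u_hat), 1.0)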

  • @moritz4150
    @moritz4150 1 year ago

    Thanks, it was very easy to follow your thought process. Helped me very much!

  • @deepak_kori
    @deepak_kori 1 year ago

    OMG the application part was amazing😍

  • @ritvikdhupkar5861
    @ritvikdhupkar5861 4 years ago

    Great, clear explanations... Thanks a lot!

  • @enozeren
    @enozeren 1 year ago

    Great short explanation! Thanks!

  • @baraaa.2338
    @baraaa.2338 4 years ago +1

    Awesome explanation... Keep it up!

  • @bungercolumbus
    @bungercolumbus 24 days ago

    Really well explained, good job.

  • @y031962
    @y031962 4 years ago

    Thanks for posting this; it would have been nicer to show how a matrix raised to a power is used in data science, though.
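
    For what it's worth, the standard trick is that A = UΛU^-1 gives A^k = UΛ^kU^-1, so only the scalar eigenvalues get raised to the power. A minimal sketch with a made-up matrix (not the one from the video):

        import numpy as np

        A = np.array([[0.9, 0.2],
                      [0.1, 0.8]])                     # hypothetical matrix
        vals, U = np.linalg.eig(A)

        k = 50
        A_k = U @ np.diag(vals**k) @ np.linalg.inv(U)  # exponentiate scalars only
        assert np.allclose(A_k, np.linalg.matrix_power(A, k))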

  • @younesarabnedjadi4419
    @younesarabnedjadi4419 2 years ago

    OMG, literally understood the eigen shit in 8 minutes, thank you so much

  • @ramsaini3745
    @ramsaini3745 4 months ago

    Your videos are helpful and concise at the same time; that's rare on today's YT.

  • @mahdijavadi2747
    @mahdijavadi2747 3 years ago

    Thanks a lot for this clear explanation!

  • @YTGiomar
    @YTGiomar 10 months ago

    Damn, just a good video. Thank you very much for explaining

  • @abhinavmishra9401
    @abhinavmishra9401 4 years ago +1

    Thanks a lot. This was sublime.

  • @ekaterinaburakova8629
    @ekaterinaburakova8629 11 months ago

    Wow, such a good explanation!

    • @ritvikmath
      @ritvikmath  11 months ago

      Glad it was helpful!

  • @CStrik3r
    @CStrik3r 4 years ago +1

    Great explanation!

  • @jatinkumar4410
    @jatinkumar4410 3 years ago +1

    Thanks... Very nice explanation...

  • @tanmaygupta8288
    @tanmaygupta8288 11 months ago

    Thank you so much, you are a saviour.

  • @suvikarhu4627
    @suvikarhu4627 2 years ago

    7:54 Shouldn't you do the rightmost multiplication first? Lambda * U inverse.
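
    For what it's worth, matrix multiplication is associative, so either grouping gives the same product; only the left-to-right order of the factors is fixed. A quick numpy check with arbitrary matrices:

        import numpy as np

        rng = np.random.default_rng(0)
        U = rng.random((2, 2))
        Lam = np.diag(rng.random(2))
        U_inv = np.linalg.inv(U)
        # (U Λ) U^-1 equals U (Λ U^-1): associativity
        assert np.allclose((U @ Lam) @ U_inv, U @ (Lam @ U_inv))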

  • @보라색사과-l1r
    @보라색사과-l1r 4 years ago +1

    Thank you for this amazingly simple explanation!
    Could you give me an example of that kind of multiplication used in Machine Learning?

  • @sgifford1000
    @sgifford1000 4 years ago +1

    You have a great channel! Thanks for the insight, which is hard to come by. Just one area was confusing to me at the time: the definition of the 2x2 matrices for u1 and u2. They look like 3x2 matrices with values 1 & u1 (or u2). I did figure it out though. Thanks!

    • @ritvikmath
      @ritvikmath  4 years ago

      Thank you!

    • @Galmion
      @Galmion 2 years ago +1

      Can you elaborate on this? I still don't get how it isn't a 3x2 matrix.

    • @Arycke
      @Arycke 1 year ago

      @@Galmion It shouldn't have been written the way it was, in my opinion, as it causes confusion. Those "1's" are just dot dot dots, ..., meant to be arbitrary entries.

    • @Arycke
      @Arycke 1 year ago

      @@Galmion The matrix U is the 2 eigenvectors, u1 and u2, put next to each other in one matrix. And since u1 and u2 are 2x1 vectors, putting them together in a matrix makes it a 2x2.

    • @Arycke
      @Arycke 1 year ago

      @@Galmion I would have chosen an example with no square roots as the first example, personally. Say your eigenvectors are

          u1 = [2]    u2 = [4]
               [3]         [5]

      Then U, the eigenvector matrix, is

          U = [2 4]
              [3 5]

      Hope this helps.
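
      In numpy terms, that construction is just stacking the eigenvectors as columns; a tiny sketch using Arycke's made-up numbers:

          import numpy as np

          u1 = np.array([2.0, 3.0])
          u2 = np.array([4.0, 5.0])
          U = np.column_stack([u1, u2])   # eigenvectors become the columns
          # U == [[2., 4.],
          #       [3., 5.]]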

  • @shashankelsalvador
    @shashankelsalvador 2 years ago

    Best intro ever

  • @user-wr4yl7tx3w
    @user-wr4yl7tx3w 2 years ago

    But are most matrices amenable to eigendecomposition? If not, doesn't that mean limited use?
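
    For context, and stated as a general fact rather than anything from the video: not every square matrix is diagonalizable (a shear matrix is the classic failure case), but real symmetric matrices, which show up everywhere in data science as covariance matrices, always are. A small numpy illustration:

        import numpy as np

        shear = np.array([[1.0, 1.0],
                          [0.0, 1.0]])        # repeated eigenvalue, one eigenvector
        vals, vecs = np.linalg.eig(shear)
        print(np.linalg.matrix_rank(vecs))    # 1: no full eigenbasis

        cov = np.array([[2.0, 0.5],
                        [0.5, 1.0]])          # symmetric, covariance-like
        vals, vecs = np.linalg.eig(cov)
        print(np.linalg.matrix_rank(vecs))    # 2: full eigenbasis exists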

  • @香港地舖購物
    @香港地舖購物 2 years ago

    Great video ! Can you also touch on the topic of LU Decomposition, Jordan Canonical Form, Rayleigh quotient, etc. ?

  • @bashiruddin3891
    @bashiruddin3891 3 years ago +1

    very nice explanation

  • @balazsbaranyai8115
    @balazsbaranyai8115 4 years ago

    Man, this rocks! Thank you!

  • @boryanasvetichkova8458
    @boryanasvetichkova8458 1 year ago

    Great video, thanks!

  • @rozhanmirzaei3512
    @rozhanmirzaei3512 3 months ago

    Love this. Thank you ❤

  • @amirhosseinmirkazemi765
    @amirhosseinmirkazemi765 4 years ago

    You are AWESOME! thank you!

  • @UsmanAbbas-k2c
    @UsmanAbbas-k2c 11 months ago

    Super helpful. Thanks

  • @rimshasardarsardarrimsha3209
    @rimshasardarsardarrimsha3209 3 years ago

    Can you tell me what the pros of this topic are?

  • @yangwang9688
    @yangwang9688 3 years ago +1

    What is the difference between decomposition and factorisation?

    • @ritvikmath
      @ritvikmath  3 years ago

      I think they're often used interchangeably

  • @heejuneAhn
    @heejuneAhn 4 years ago

    Your explanation is the best I have ever seen. But it does not explain what each component really means: U^-1 first maps/rotates the input vector into eigenvector coordinates, then the result is stretched along each eigenvector direction, and finally the vector is rotated back, restoring the original axes.
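
    A small numpy sketch of that three-step reading, with a hypothetical symmetric A (so U acts as a rotation), not the matrix from the video:

        import numpy as np

        A = np.array([[2.0, 1.0],
                      [1.0, 2.0]])
        vals, U = np.linalg.eig(A)
        x = np.array([1.0, -2.0])

        coords = np.linalg.inv(U) @ x   # 1) express x in eigenvector coordinates
        scaled = vals * coords          # 2) stretch each coordinate by its eigenvalue
        result = U @ scaled             # 3) rotate back to the original axes
        assert np.allclose(result, A @ x)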

  • @hassanshahzad3922
    @hassanshahzad3922 3 years ago +1

    this is awesome!

  • @rohanchess8332
    @rohanchess8332 1 year ago

    Why do we need normalized eigenvectors? Won't any eigenvector from the family of eigenvectors suffice?

  • @elahedastan4945
    @elahedastan4945 2 years ago

    great explanation

  • @thomasstiglich3484
    @thomasstiglich3484 3 years ago

    Great video!

  • @michael8899aspen
    @michael8899aspen 3 years ago

    If p=6 or p=7, is this arbitrary p=8?

  • @gvbvwockee
    @gvbvwockee 4 years ago +1

    Thank you. Thank you. Thank you.

  • @DRmrTG
    @DRmrTG 1 year ago

    Thanks for your help!

  • @NinjaAdorable
    @NinjaAdorable 4 years ago +1

    That was beautiful !!!! :')

  • @Fat_Cat_Fly
    @Fat_Cat_Fly 4 years ago +1

    Fantastic!!!

  • @tehminakakar8753
    @tehminakakar8753 1 year ago

    Hey, did anyone solve for the eigenvectors?
    Maybe I am wrong, but I got x1 = -2/3 x2 and x2 = -3/2 x1 when solving the equations for lambda = -5.
    If anyone got the answer, please let me know.
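
    For what it's worth, those two relations are the same constraint, each just solved for the other variable, which is exactly what should happen: the system (A - \lambda I)x = 0 pins down only a line of solutions. In LaTeX:

        x_1 = -\tfrac{2}{3}\,x_2 \iff x_2 = -\tfrac{3}{2}\,x_1,
        \qquad \text{so e.g. } (x_1, x_2) = (2, -3) \text{ works, as does any scalar multiple.}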

  • @rugahun
    @rugahun 2 years ago

    Pff, great video. I feel bad I didn't know about this guy earlier; it saves a lot of time.

  • @yunkkim1159
    @yunkkim1159 1 year ago

    Great job. I had no idea before the video; now I know everything.

  • @williamjmccartan8879
    @williamjmccartan8879 3 months ago

    Great job, peace

  • @alejandropalaciosgarcia2767
    @alejandropalaciosgarcia2767 3 years ago

    Excellent

  • @jneal4154
    @jneal4154 10 months ago

    Excellent. I was struggling to understand how the form A = UΛU^-1 is reached from the definition of an eigenvalue (Au = λu) as explained in my textbook, but the way you explained it made it all click for me. Thanks!
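
    The derivation in question, written out as the standard argument (not a transcript of the video): stack the n eigenvector equations into one matrix equation and solve for A.

        A u_i = \lambda_i u_i \quad (i = 1, \dots, n)
        \;\Longrightarrow\;
        A \underbrace{\begin{bmatrix} u_1 & \cdots & u_n \end{bmatrix}}_{U}
        = \begin{bmatrix} \lambda_1 u_1 & \cdots & \lambda_n u_n \end{bmatrix}
        = U \Lambda
        \;\Longrightarrow\;
        A = U \Lambda U^{-1},

    valid whenever the eigenvectors are linearly independent, so that U^{-1} exists.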

  • @marc2752
    @marc2752 2 years ago +1

    Awesome, thanks!

  • @zddmsmify
    @zddmsmify 4 years ago

    I wish it had been explained to me this simply when I was studying it almost 30 years ago

    • @zddmsmify
      @zddmsmify 4 years ago +1

      Exceptional explanation

    • @ritvikmath
      @ritvikmath  4 years ago

      Thanks for the kind words!

  • @davidfield5295
    @davidfield5295 1 year ago

    Good video

  • @satyamgupta4808
    @satyamgupta4808 1 year ago

    very nice

  • @svengunther7653
    @svengunther7653 3 years ago

    Thanks man!

  • @thedailyepochs338
    @thedailyepochs338 3 years ago

    Beautiful

  • @mydodethailung395
    @mydodethailung395 1 year ago

    Amazing

  • @NeoZondix
    @NeoZondix 2 years ago

    Thanks

  • @klingefjord
    @klingefjord 4 years ago

    Hang on, if a matrix times its inverse is the identity matrix, why can't the formula for eigendecomposition (U * lambda * U^-1) be simplified to just lambda?

    • @Pukimaxim
      @Pukimaxim 4 years ago

      You cannot rearrange an equation with matrix multiplication the way you would with numbers/variables.

    • @Rudolf-ul1zh
      @Rudolf-ul1zh 4 years ago

      Exactly, matrix multiplication is not commutative!
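
      A concrete check of this, with an arbitrary invertible U and a diagonal Λ; because the factors don't commute, UΛU^-1 is generally not Λ:

          import numpy as np

          U = np.array([[1.0, 1.0],
                        [0.0, 1.0]])   # arbitrary invertible matrix
          Lam = np.diag([2.0, 3.0])
          M = U @ Lam @ np.linalg.inv(U)
          print(M)                     # [[2. 1.] [0. 3.]] -- not diagonal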

  • @gelvis11.11
    @gelvis11.11 1 year ago

    Thank you

  • @kiracite
    @kiracite 1 year ago

    10/10 ty

  • @krishnachauhan2850
    @krishnachauhan2850 3 years ago

    Awesome

  • @yashjain6372
    @yashjain6372 2 years ago

    nice

  • @Shayan7755
    @Shayan7755 2 years ago

    Damn, I like you. Good job!

  • @Arycke
    @Arycke 1 year ago

    SVD is superior imo

  • @abdelrahmanelbeltagy3942
    @abdelrahmanelbeltagy3942 2 years ago

    thanks*10^10000

  • @indianmovierecaps3192
    @indianmovierecaps3192 3 years ago

    thanksssssssssssssssssssssssssssssssssssssssss

  • @marcoloya
    @marcoloya 3 years ago

    Coool

  • @newmanokereafor2368
    @newmanokereafor2368 3 years ago

    Beautiful and handsome and pretty and