PCA explained with intuition, a little math and code

  • Published Dec 3, 2024

COMMENTS • 23

  • @exoticcoder5365
    @exoticcoder5365 1 year ago +2

    I really found that the little coffee bean animation helped me concentrate on the video! Thank you for this strategy!

  • @matt.jordan
    @matt.jordan 3 years ago +6

    Such a great explanation, awesome work!!

  • @EGlobalKnowledge
    @EGlobalKnowledge 2 years ago +1

    A very good explanation, with details of how it works.

  • @quote.d
    @quote.d 2 years ago +2

    Thanks! Great explanation and visuals, and I especially enjoyed the whispered parts. I'm sure I'm not the only one who gets this emotional reaction to information that is whispered instead of plainly said. And emotions are very important for remembering things. Please consider making a fully whispered redub of your videos!

    • @AICoffeeBreak
      @AICoffeeBreak  2 years ago +2

      ASMR with Machine Learning content. 😅

  • @vincetechclass3390
    @vincetechclass3390 1 year ago +1

    Nice presentation. Please, what tool did you use for the presentation?

    • @AICoffeeBreak
      @AICoffeeBreak  1 year ago +1

      Thanks! I used good old PowerPoint. 😅

  • @omarlopezrincon
    @omarlopezrincon 2 years ago

    Ha ha ha, I was hoping to learn how to calculate the eigenvectors. I love this channel!
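
For readers who, like the commenter above, want the eigenvector calculation spelled out: below is a minimal NumPy sketch of PCA via the eigendecomposition of the covariance matrix. The data and variable names are illustrative, not taken from the video.

    import numpy as np

    # Illustrative data: 200 samples with 5 features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))

    # 1) Center the data; PCA assumes zero-mean features.
    Xc = X - X.mean(axis=0)

    # 2) Covariance matrix (D x D).
    cov = np.cov(Xc, rowvar=False)

    # 3) Eigendecomposition; eigh is for symmetric matrices and
    #    returns eigenvalues in ascending order.
    eigvals, eigvecs = np.linalg.eigh(cov)

    # 4) Reorder so the first direction explains the most variance.
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    # 5) Project onto the top two principal components.
    Z = Xc @ eigvecs[:, :2]
    print(Z.shape)  # (200, 2)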

  • @chenzakaim3
    @chenzakaim3 2 years ago +1

    You are really awesome! Thanks a lot.

    • @AICoffeeBreak
      @AICoffeeBreak  2 years ago +1

      🙂 Thanks for watching and leaving this awesome comment!

  • @KevinTurner-aka-keturn
    @KevinTurner-aka-keturn 2 years ago

    I tried doing some dimensionality reduction using yellowbrick and sklearn on what I _thought_ was a very modestly sized data set, and I was surprised by how long it took! I guess it was probably the manifold learning methods that took longer than PCA, but I don't recall PCA being exactly quick either.
    Is that expected? Are there techniques for subsampling data to get some faster approximation?
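
On the subsampling question above: one common approach is scikit-learn's randomized SVD solver, and another is fitting PCA on a random subsample and then transforming the full data set. A sketch under assumed sizes (the array X here is synthetic, not the commenter's data):

    import numpy as np
    from sklearn.decomposition import PCA

    rng = np.random.default_rng(0)
    X = rng.normal(size=(50_000, 300))  # synthetic stand-in data set

    # Option 1: randomized SVD, much faster than a full SVD when
    # only a few components are needed.
    pca = PCA(n_components=10, svd_solver="randomized", random_state=0)
    Z = pca.fit_transform(X)

    # Option 2: fit on a random subsample, then transform everything.
    idx = rng.choice(len(X), size=5_000, replace=False)
    pca_sub = PCA(n_components=10, svd_solver="randomized", random_state=0)
    pca_sub.fit(X[idx])
    Z_approx = pca_sub.transform(X)

For data that does not fit in memory, scikit-learn also provides IncrementalPCA, which fits batch-wise.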

  • @kellymarchisio377
    @kellymarchisio377 2 years ago

    First off - loving the videos! Thanks for the fun and clear explanations. Quick clarification, though: the matrix V at 5:22 is drawn as a D' x D matrix, no? Are we meant to actually have z_i = x_i V^T (V-transpose)?
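
A quick dimension check of the question above, assuming V is stored as D' x D with one principal direction per row, as in the frame the commenter describes (the shapes below are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    D, D_prime = 5, 2               # original and reduced dimensionality
    x_i = rng.normal(size=(1, D))   # one data point as a row vector

    # With V of shape (D', D), the product x_i V^T has shape (1, D'),
    # matching the z_i = x_i V^T suggested in the comment.
    V = rng.normal(size=(D_prime, D))
    z_i = x_i @ V.T
    print(z_i.shape)  # (1, 2)

    # Equivalently, define W = V^T of shape (D, D') and use z_i = x_i W.
    W = V.T
    assert np.allclose(z_i, x_i @ W)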

  • @user-or7ji5hv8y
    @user-or7ji5hv8y 4 years ago +3

    I replaced my morning coffee with AI coffee. :)

  • @dontaskme1625
    @dontaskme1625 3 years ago +2

    Wouldn't a coffee bean be afraid of a coffee break, because that's the point in time when it would most likely be ground up?

    • @AICoffeeBreak
      @AICoffeeBreak  3 years ago +6

      Shhh!!! Don't tell Ms. Coffee Bean that! 🤫😱

  • @ricardoabraham4016
    @ricardoabraham4016 2 years ago +1

    Thank you, this is so cute.

  • @rumanubhardwaj6559
    @rumanubhardwaj6559 4 years ago +3

    Funny? Ya(y.)