Principal Component Analysis (PCA) | Dimensionality Reduction Techniques (2/5)

  • Published May 31, 2024
  • ▬▬ Papers / Resources ▬▬▬
    Colab Notebook: colab.research.google.com/dri...
    Peter Bloem PCA Blog: peterbloem.nl/blog/pca
    PCA for DS book: pca4ds.github.io/basic.html
    PCA Book: cda.psych.uiuc.edu/statistical...
    Lagrange Multipliers: ekamperi.github.io/mathematic...
    PCA Mathematical derivation #1: www.quora.com/Why-does-PCA-ch...
    PCA Mathematical derivation #2: towardsdatascience.com/princi...
    PCA Mathematical derivation #3: rich-d-wilkinson.github.io/MA...
    PCA Mathematical derivation #4: stats.stackexchange.com/quest...
    PCA Mathematical derivation #5: / geometrical-and-mathem...
    Eigenvectors and Eigenvalues: sebastianraschka.com/Articles...
    Image Sources:
    - Eigenfaces: towardsdatascience.com/eigenf...
    - Hyperplane: www.analyticsvidhya.com/blog/...
    ▬▬ Support me if you like 🌟
    ►Link to this channel: bit.ly/3zEqL1W
    ►Support me on Patreon: bit.ly/2Wed242
    ►Buy me a coffee on Ko-Fi: bit.ly/3kJYEdl
    ►E-Mail: deepfindr@gmail.com
    ▬▬ Used Music ▬▬▬▬▬▬▬▬▬▬▬
    Music from #Uppbeat (free for Creators!):
    uppbeat.io/t/sulyya/weather-c...
    License code: ZRGIWRHMLMZMAHQI
    ▬▬ Used Icons ▬▬▬▬▬▬▬▬▬▬
    All Icons are from flaticon: www.flaticon.com/authors/freepik
    ▬▬ Timestamps ▬▬▬▬▬▬▬▬▬▬▬
    00:00 Introduction
    00:26 Used Literature
    00:41 Example dataset
    02:47 Variance
    03:52 Projecting data
    04:14 Variance as measure of information
    05:15 Scree Plot
    05:53 Principal Components
    06:14 PCA on images
    07:00 Reconstruction based on eigenfaces
    07:35 Orthogonal Basis
    08:12 Kernel PCA
    08:45 Finding principal components
    09:28 Distance minimization vs. Variance maximization
    10:45 Covariance Matrix
    11:35 Correlation vs. Covariance
    11:50 Covariance examples
    12:50 Linear Algebra Basics
    14:22 Eigenvectors and Eigenvalues
    15:30 Eigenvector Equation
    16:10 Spectral Theorem
    16:40 Connection between Eigenvectors and Principal Components
    17:23 [STEP 1]: Centering the Data
    17:54 [STEP 2]: Calculate Covariance Matrix
    18:25 [STEP 3]: Eigenvalue Decomposition
    19:05 How to find eigenvectors?
    19:17 The truth :O
    19:27 Singular value decomposition
    20:21 Why eigendecomposition at all?
    20:45 [STEP 4]: Projection onto PCs
    21:12 Orthogonal Eigenvectors
    21:49 Dimensionality Reduction Projection
    22:11 [CODE]
    24:52 Summary Table
    ▬▬ My equipment 💻
    - Microphone: amzn.to/3DVqB8H
    - Microphone mount: amzn.to/3BWUcOJ
    - Monitors: amzn.to/3G2Jjgr
    - Monitor mount: amzn.to/3AWGIAY
    - Height-adjustable table: amzn.to/3aUysXC
    - Ergonomic chair: amzn.to/3phQg7r
    - PC case: amzn.to/3jdlI2Y
    - GPU: amzn.to/3AWyzwy
    - Keyboard: amzn.to/2XskWHP
    - Bluelight filter glasses: amzn.to/3pj0fK2
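    ▬▬ Pipeline sketch ▬▬▬▬▬▬▬▬▬
    A minimal NumPy sketch of the [STEP 1]–[STEP 4] chapters listed in the timestamps (centering, covariance matrix, eigendecomposition / SVD, projection onto the PCs). This is not the linked Colab notebook, just an illustrative, self-contained example on random placeholder data:

    ```python
    import numpy as np

    # Toy data: 100 samples, 5 features (random placeholder, not the video's example dataset)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))

    # [STEP 1]: Center the data (subtract the per-feature mean)
    X_centered = X - X.mean(axis=0)

    # [STEP 2]: Calculate the covariance matrix (features x features)
    cov = np.cov(X_centered, rowvar=False)

    # [STEP 3]: Eigenvalue decomposition (eigh, since the covariance matrix is symmetric),
    # then sort the eigenvectors by eigenvalue (= explained variance), descending
    eigenvalues, eigenvectors = np.linalg.eigh(cov)
    order = np.argsort(eigenvalues)[::-1]
    eigenvalues, eigenvectors = eigenvalues[order], eigenvectors[:, order]

    # Equivalent route via singular value decomposition of the centered data
    # (what most libraries actually do under the hood); the rows of Vt are the PCs
    U, S, Vt = np.linalg.svd(X_centered, full_matrices=False)

    # [STEP 4]: Projection onto the first k principal components
    k = 2
    X_reduced = X_centered @ eigenvectors[:, :k]
    print(X_reduced.shape)  # (100, 2)
    ```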

COMMENTS • 13

  • @yashdevarshi2583
    @yashdevarshi2583 4 months ago

    Amazing explanation! Wish my uni professors were like you

  • @TheZapalsky
    @TheZapalsky 5 months ago +1

    great content!

  • @anas.aldadi
    @anas.aldadi 5 months ago

    What a nice and cool channel to discover! I stumbled upon your channel while searching for a mathematical explanation of diffusion theory and models!

    • @DeepFindr
      @DeepFindr  5 months ago

      Thanks! Appreciated :)

  • @MegaBoss1980
    @MegaBoss1980 5 months ago +1

    In your future series, will you also cover PCA for categorical variables? Also, can we apply PCA on embeddings of categorical variables?

    • @DeepFindr
      @DeepFindr  5 months ago +1

      Hi :) For this series, that's the only video about PCA. The next videos will cover other techniques - the goal is to give a good overview of each method.
      PCA is designed for continuous variables, so the projections don't make much sense for categorical data, mainly because distances are not properly defined. Of course it's possible to apply it anyway, for example on one-hot-encoded variables, but it might not be the best choice. You might want to look into Multiple Correspondence Analysis (MCA), which is designed for categorical variables.
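      A rough sketch of the one-hot route mentioned above, assuming pandas and scikit-learn (not code from the video); the distance caveat still applies, and MCA would usually be the better tool:

      ```python
      import pandas as pd
      from sklearn.decomposition import PCA

      # Hypothetical categorical data (illustrative only)
      df = pd.DataFrame({
          "color": ["red", "green", "blue", "green", "red"],
          "size":  ["S", "M", "L", "M", "S"],
      })

      # One-hot encode, then run ordinary PCA on the indicator matrix.
      # Distances between one-hot vectors are crude, so interpret the components with care.
      X = pd.get_dummies(df).astype(float)
      X_2d = PCA(n_components=2).fit_transform(X)
      print(X_2d)
      ```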

  • @FabioDBB
    @FabioDBB 2 months ago

    Amazing explanation dude
    Rome is way bigger than NYC btw

    • @DeepFindr
      @DeepFindr  2 months ago

      Thanks! Yeah, area-wise it is, but not population-wise, right?

  • @yeshiwangmo5920
    @yeshiwangmo5920 1 month ago

    Do you have a PPT on this?

  • @shaz-z506
    @shaz-z506 5 months ago

    Nice one 😃, could you please extend this and explain kernel PCA in a similar manner? I don't think there are many videos on kernel PCA.

    • @DeepFindr
      @DeepFindr  5 months ago

      Will put it on the list but can't promise :D

  • @YouKnowWhoIAm118
    @YouKnowWhoIAm118 5 months ago

    Hi, your explainable AI playlist could be updated ;) no offense bro, just as a suggestion

    • @DeepFindr
      @DeepFindr  5 months ago

      Hehe with which method? :)