Principal Component Analysis (PCA) | Dimensionality Reduction Techniques (2/5)
- Published May 31, 2024
▬▬ Papers / Resources ▬▬▬
Colab Notebook: colab.research.google.com/dri...
Peter Bloem PCA Blog: peterbloem.nl/blog/pca
PCA for DS book: pca4ds.github.io/basic.html
PCA Book: cda.psych.uiuc.edu/statistical...
Lagrange Multipliers: ekamperi.github.io/mathematic...
PCA Mathematical derivation #1: www.quora.com/Why-does-PCA-ch...
PCA Mathematical derivation #2: towardsdatascience.com/princi...
PCA Mathematical derivation #3: rich-d-wilkinson.github.io/MA...
PCA Mathematical derivation #4: stats.stackexchange.com/quest...
PCA Mathematical derivation #5: / geometrical-and-mathem...
Eigenvectors and Eigenvalues: sebastianraschka.com/Articles...
Image Sources:
- Eigenfaces: towardsdatascience.com/eigenf...
- Hyperplane: www.analyticsvidhya.com/blog/...
▬▬ Support me if you like 🌟
►Link to this channel: bit.ly/3zEqL1W
►Support me on Patreon: bit.ly/2Wed242
►Buy me a coffee on Ko-Fi: bit.ly/3kJYEdl
►E-Mail: deepfindr@gmail.com
▬▬ Used Music ▬▬▬▬▬▬▬▬▬▬▬
Music from #Uppbeat (free for Creators!):
uppbeat.io/t/sulyya/weather-c...
License code: ZRGIWRHMLMZMAHQI
▬▬ Used Icons ▬▬▬▬▬▬▬▬▬▬
All Icons are from flaticon: www.flaticon.com/authors/freepik
▬▬ Timestamps ▬▬▬▬▬▬▬▬▬▬▬
00:00 Introduction
00:26 Used Literature
00:41 Example dataset
02:47 Variance
03:52 Projecting data
04:14 Variance as measure of information
05:15 Scree Plot
05:53 Principal Components
06:14 PCA on images
07:00 Reconstruction based on eigenfaces
07:35 Orthogonal Basis
08:12 Kernel PCA
08:45 Finding principal components
09:28 Distance minimization vs. Variance maximization
10:45 Covariance Matrix
11:35 Correlation vs. Covariance
11:50 Covariance examples
12:50 Linear Algebra Basics
14:22 Eigenvectors and Eigenvalues
15:30 Eigenvector Equation
16:10 Spectral Theorem
16:40 Connection between Eigenvectors and Principal Components
17:23 [STEP 1]: Centering the Data
17:54 [STEP 2]: Calculate Covariance Matrix
18:25 [STEP 3]: Eigenvalue Decomposition
19:05 How to find eigenvectors?
19:17 The truth :O
19:27 Singular value decomposition
20:21 Why eigendecomposition at all?
20:45 [STEP 4]: Projection onto PCs
21:12 Orthogonal Eigenvectors
21:49 Dimensionality Reduction Projection
22:11 [CODE]
24:52 Summary Table
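The four [STEP] chapters above (centering, covariance matrix, eigenvalue decomposition, projection) can be sketched in NumPy as follows. This is a minimal illustration on made-up toy data, not the linked notebook's code; it also shows the SVD shortcut mentioned at 19:27.

```python
import numpy as np

# Toy data: 5 samples, 3 features (values are illustrative)
X = np.array([[2.5, 2.4, 0.5],
              [0.5, 0.7, 1.9],
              [2.2, 2.9, 0.8],
              [1.9, 2.2, 0.9],
              [3.1, 3.0, 0.4]])

# [STEP 1] Center the data: subtract the per-feature mean
Xc = X - X.mean(axis=0)

# [STEP 2] Covariance matrix (features x features)
C = np.cov(Xc, rowvar=False)

# [STEP 3] Eigenvalue decomposition (eigh, since C is symmetric),
# then sort eigenpairs by descending eigenvalue (= explained variance)
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# [STEP 4] Project onto the top-k principal components
k = 2
X_reduced = Xc @ eigvecs[:, :k]

# In practice libraries use the SVD of the centered data instead of
# forming C explicitly; singular values relate to eigenvalues via
# eigvals = S**2 / (n - 1)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
```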
▬▬ My equipment 💻
- Microphone: amzn.to/3DVqB8H
- Microphone mount: amzn.to/3BWUcOJ
- Monitors: amzn.to/3G2Jjgr
- Monitor mount: amzn.to/3AWGIAY
- Height-adjustable table: amzn.to/3aUysXC
- Ergonomic chair: amzn.to/3phQg7r
- PC case: amzn.to/3jdlI2Y
- GPU: amzn.to/3AWyzwy
- Keyboard: amzn.to/2XskWHP
- Bluelight filter glasses: amzn.to/3pj0fK2
Amazing explanation! Wish my uni professors were like you
great content!
What a nice and cool channel to discover! I stumbled upon your channel searching for a mathematical explanation of diffusion theory and models!
Thanks! Appreciated :)
In your future series, will you also cover PCA for categorical variables? Also, can we apply PCA on embeddings of categorical variables?
Hi :) For this series, that's the only video about PCA. The next videos will be about other techniques; the series is mainly intended to give a good overview of each method.
PCA is designed for continuous variables; the projections don't make much sense for categorical data, mainly because distances are not properly defined there. Of course it's possible to apply it anyway, for example on one-hot encoded variables, but it might not be the best choice. You might want to look into Multiple Correspondence Analysis (MCA), which is designed for categorical variables.
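A minimal sketch of the one-hot + PCA route mentioned in the reply, using scikit-learn. The toy data is made up for illustration; for MCA itself, the `prince` package is one commonly used option.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import OneHotEncoder

# Hypothetical categorical toy data: two categorical features
rows = [["red", "small"],
        ["blue", "large"],
        ["red", "large"],
        ["green", "small"]]

# One-hot encode (3 colors + 2 sizes -> 5 binary columns)
X_onehot = OneHotEncoder().fit_transform(rows).toarray()

# Plain PCA on the binary matrix; works, but distances between
# one-hot vectors are a crude notion of similarity
X_2d = PCA(n_components=2).fit_transform(X_onehot)
```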
Amazing explanation dude
Rome is way bigger than NYC btw
Thanks! Yeah, area-wise it is, but not population-wise, right?
Do you have a PPT on this?
Nice one 😃. Could you please extend this and explain kernel PCA in a similar manner? I don't think there are many videos on kernel PCA.
Will put it on the list but can't promise :D
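In the meantime, a minimal kernel PCA sketch with scikit-learn's `KernelPCA`. The dataset and kernel parameters here are illustrative assumptions, not from the video.

```python
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA

# Two concentric circles: no straight line (or linear PC) separates them
X, y = make_circles(n_samples=200, factor=0.3, noise=0.05, random_state=0)

# An RBF kernel implicitly maps the data into a feature space where
# the two rings become (approximately) linearly separable
kpca = KernelPCA(n_components=2, kernel="rbf", gamma=10)
X_kpca = kpca.fit_transform(X)
```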
Hi, your explainable AI playlist could be updated ;) No offense, just a suggestion.
Hehe with which method? :)