Principal Component Analysis (PCA) | Part 3 | Code Example and Visualization
- Published 16 Jul 2024
- In this video, we provide a code example and visualization to showcase how to implement PCA in Python. Follow along and see the power of PCA in action, simplifying data and enhancing visualization for better insights.
Code used: github.com/campusx-official/1...
TASK PCA: colab.research.google.com/dri...
Task PCA Solution : colab.research.google.com/dri...
============================
Do you want to learn from me?
Check my affordable mentorship program at : learnwith.campusx.in/s/store
============================
📱 Grow with us:
CampusX's LinkedIn: / campusx-official
CampusX on Instagram for daily tips: / campusx.official
My LinkedIn: / nitish-singh-03412789
Discord: / discord
E-mail us at support@campusx.in
⌚Time Stamps⌚
00:00 - Practical Example on MNIST Dataset
01:30 - PCA Code Demo
21:50 - 2D Visualization
28:50 - Explained Variance
30:48 - Finding the Optimum Number of Principal Components Required
34:34 - PCA Code Demo
39:35 - When does PCA not work?
43:00 - Outro
Honestly, you deserve the Bharat Ratna. Teachers like you are rare as heck!!
I thank the stars for finding your channel. If I ever make something of myself in this field I won't forget you and would help you in any way I can.
Heartfelt thanks
To contribute, you can join his paid service. Just a thought.
@@amitnakrani you're right, we all should do the same. It will be the best form of appreciation to him as well as encouragement to produce more resources in future.
bharat ratna ?
@@vinayakmane7569 🤣🤣🤣🤣
And the Madan Puraskar too. Love from Nepal
A lot of beautiful statistical concepts were left out of this video, e.g. why the eigenvectors are orthogonal, etc.
PCA has never been clearer. The three PCA videos in this playlist are very powerful and have definitely helped me understand PCA better.
Not entirely, though. You can go much deeper: the covariance matrix is symmetric positive semi-definite, which implies that the eigenvectors are orthogonal and the eigenvalues are all real.
@@Raj-gc2rc Honestly, nothing deeper than this is required for my regular professional data science work. Thanks for your concern though.
@@ankitbiswas8380 It depends on what work you are doing... which application are you working on?
You are not just a teacher, you are an absolute genius.
For the first time I can say that I am enjoying learning, especially coding. All thanks to you, sir.
In the last part of the video, where you explained how to choose the number of components to keep by plotting a graph, I want to add that this can be done very easily: PCA's n_components argument, which takes the number of components to create, also accepts a float value like 0.80 or 0.90, which tells PCA to keep as many components as are needed to explain that much of the variance of the data. Reference: the book "Hands-On Machine Learning with Scikit-Learn", the PCA chapter. Thank you, all three videos cover the topic excellently.
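To make the comment concrete, here is a minimal sketch of the float-valued n_components behaviour in scikit-learn (the dataset and variable names are illustrative, not from the video):

```python
from sklearn.datasets import load_digits
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

X, _ = load_digits(return_X_y=True)            # 1797 samples, 64 pixel features
X_scaled = StandardScaler().fit_transform(X)

# A float in (0, 1) asks PCA to keep the smallest number of components
# whose cumulative explained variance reaches that ratio.
pca = PCA(n_components=0.90)
X_reduced = pca.fit_transform(X_scaled)

print(X_reduced.shape[1])                      # components actually kept (< 64)
print(pca.explained_variance_ratio_.sum())     # cumulative ratio, >= 0.90
```

This is equivalent to reading the elbow off the cumulative explained-variance plot shown in the video, just automated.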
Great job mate!
Can you please start time series and its algorithms? In many of the interviews I attended, the client expectation was that time series is a fundamental concept and one should be adroit at time series problems. Surprisingly, very few trainers on YouTube are exploring this topic, and that too only on the surface.
I request you to please start time series lectures, as it's a very intuitive topic and you're exceptional at explaining complex topics.
Regards
Are they expecting us to know the theory of random variables and random processes, like wide-sense stationary and ergodic processes, in time series interviews?
It was just a perfect explanation of pca, Loved it.
Without PCA on the wine data the accuracy score was 94%; with PCA, 99% ❤
You are the best, I can assure you. There is not a single channel on YouTube on ML and data science that can beat you.
Till now you are the best at explaining 💚💚🔥... you don't skip even a single point
Awesome and informative video, great job bro. Looking forward to more from you on deep learning as well. Thanks a lot.
This is a really wonderful explanation... thank you so much, sir, for providing such a valuable video
Thank You Sir.
Wonderful, sir, your explanation is wonderful
Awesome video. Thank you 😍
Brilliant as always. PCA can also be added as a pipeline component in the sklearn pipeline video that you made!
thanku sir for such a great explanation 🙏
I even forget to comment, your videos are that much interesting!!!!!! love from Pakistan.😍
great job sir ji
Thank you so much
Nice Explanation ❤❤
What an explanation, brother 🙌❤
Amazing!!
Simply... awesome
One of the main drawbacks of PCA is that you lose your original features, and hence interpretability and explainability
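One partial remedy for the interpretability loss mentioned above: the fitted components_ matrix records how each original feature loads onto each principal component, so you can at least see which features dominate each component. A hedged sketch (the dataset choice is mine, for illustration only):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA

data = load_iris()
X = StandardScaler().fit_transform(data.data)

pca = PCA(n_components=2).fit(X)

# Each row of components_ is a unit-length direction in the original
# feature space; large absolute loadings mark the features that
# dominate that component.
for i, comp in enumerate(pca.components_):
    top = np.argsort(np.abs(comp))[::-1][:2]
    print(f"PC{i+1} is driven mostly by:",
          [data.feature_names[j] for j in top])
```

The components remain linear mixtures, so this only softens the drawback; it does not remove it.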
amazing
Thank you
Brother, this is the best video 🥰🥰🥰🥰🥰🥰
Great 😃
thanks
sir best video
Sir, can you please make a video on binary particle swarm optimisation and how it is used for feature selection.
Thank you so much😭😭😭😭
This guy is a Gem. Getting such amazing quality content for data science that too in Hindi is almost a miracle.
Sir, please can you tell me where I can find deployment of an ML model using Flask and Heroku, preferably a DL model... Please reply
How can we run online predictions on a model trained using features extracted by PCA? How do we apply the PCA transform in this case?
Is it possible to have an example of classifying pictures into two categories?
If the dimensions are reduced with PCA, is classification with KNN better? Please explain.
One question remains: if we have a deployed model, how would PCA be applied to it? Because at that time there won't be any matrix (here we have test data, so we can compute the covariance matrix's eigenvectors and eigenvalues and multiply them with the test data); only single observation vectors will come in. The same question arises for mean centering and standard scaling.
Please let me know if any clarity is required on that question.
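My understanding of the answer to the deployment question above: PCA's eigenvectors (and the scaler's mean/std) are learned once from the training data and stored inside the fitted objects; at serving time you only call transform, so no new covariance matrix is ever computed. A sketch assuming a scikit-learn pipeline (model choices are illustrative):

```python
from sklearn.datasets import load_digits
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

X, y = load_digits(return_X_y=True)

# The scaler's mean/std and PCA's eigenvectors are fitted ONCE here
# and frozen inside the pipeline.
model = make_pipeline(StandardScaler(),
                      PCA(n_components=30),
                      LogisticRegression(max_iter=1000))
model.fit(X, y)

# At deployment a single observation arrives as one row vector;
# predict() re-applies the STORED centering, scaling, and projection.
new_obs = X[0].reshape(1, -1)
print(model.predict(new_obs))
```

In practice the whole fitted pipeline is serialized (e.g. with joblib) and loaded in the serving process, so scaling and PCA travel with the model.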
best
Sir, at the end of the video you gave an example of data where PCA doesn't help — how do we plot that? Do we only have to look at the y-label data, or do we have to plot all the x features to see the shape of the data?
Please tell me
Sir-ji, LDA and t-SNE are still left after PCA, please make videos on those too!
loved the visualization felt that happiness
Sir, do we have to memorize this plotting code?
you deserve Bharat Ratna
Hey, PCA(n_components=3) is not working??? It shows this error: ValueError: n_components=3 must be between 0 and min(n_samples, n_features)=2 with svd_solver='full'. (What to do? Please help)
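The error message in the comment above is self-descriptive: PCA cannot produce more components than min(n_samples, n_features), and that data apparently has only 2 features (or 2 samples). A small reproduction and fix on made-up data:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.random((100, 2))      # only 2 features, like in the error message

# PCA(n_components=3).fit(X) would raise ValueError here, because
# n_components must satisfy n_components <= min(n_samples, n_features) = 2.
max_components = min(X.shape)
pca = PCA(n_components=min(3, max_components)).fit(X)
print(pca.n_components_)      # 2
```

So either pass fewer components or supply data with more features.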
Sir, Kaggle code is not available here. Only previous video github link available.
love from Pakistan
Is PCA applicable to continuous or discrete data? Can we use it for categorical data? In this video you use it for categorical data, which I think is not correct.
Sir, This specific class source code is not available on github. please share the source code.
#CampusX How would PCA know which features should be excluded? Or, how can I know which features are used @16:44?
Features are selected by PCA on the basis of eigenvalues and eigenvectors, which is explained in the previous video. We always select the n_estimator value where accuracy is highest; we can run a loop in intervals of 100, 200, 300 and so on to check which n_estimator gives the best accuracy.
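The search the reply describes can be sketched as a simple loop over candidate values, scoring each on held-out data. I use PCA's n_components as the swept parameter here as an assumption (the same pattern works for a forest's n_estimators); the candidate values and models are illustrative:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

scores = {}
# Try a few candidate component counts and keep the one with the
# best held-out accuracy.
for k in (5, 10, 20, 40):
    clf = make_pipeline(StandardScaler(),
                        PCA(n_components=k),
                        KNeighborsClassifier())
    scores[k] = clf.fit(X_tr, y_tr).score(X_te, y_te)

best_k = max(scores, key=scores.get)
print(best_k, scores[best_k])
```

GridSearchCV automates exactly this kind of sweep with cross-validation instead of a single split.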
Please reply sir....
Can we use PCA to reduce the dimensionality of a highly non-linear dataset?
no
I need the PCA code in Spyder 3.8
Sir, the code & dataset (MNIST) are missing, please re-upload them ;-)
Sir, where is the LDA topic?
so difficult
Sir, can PCA be applied to a regression task?
Yes bro
Not on a non-linear dataset, but yes on a linear dataset.
English Please.............................
Sir, the thing is I am much more comfortable teaching in Hindi. So it's a conscious decision from my side. I am really sorry if it causes you inconvenience. I hope you understand.
@@campusx-official Sure, I get you. Thanks anyways
Sir, please continue in hindi
If you want English, why don't you switch to other videos? YouTube is already filled with English...
We need this content in Hindi that is very rare..👍👍👍🔥🔥❤️❤️
@@campusx-official Yes sir, please continue in Hindi only 😊😊😊🙂🙂🙂