4 important points from the video --
1. PCA solves the problem of overfitting.
2. PCA reduces a high-dimensionality dataset to low dimensionality.
3. The number of PCs can be less than or equal to the number of attributes, although it also depends on other factors such as dimensionality.
4. PCs should be orthogonal, i.e. independent of each other.
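The four points above can be checked numerically. Here is a minimal NumPy sketch, assuming a small synthetic dataset (all data and sizes here are made up for illustration, not from the video):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical dataset: 100 samples, 3 correlated attributes.
X = rng.normal(size=(100, 3))
X[:, 2] = X[:, 0] + 0.1 * rng.normal(size=100)  # make attributes correlated

Xc = X - X.mean(axis=0)              # center the data
cov = Xc.T @ Xc / (len(Xc) - 1)      # covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]    # sort PCs by explained variance
pcs = eigvecs[:, order]              # columns are PC1, PC2, PC3

# Point 3: the number of PCs never exceeds the number of attributes.
assert pcs.shape[1] <= X.shape[1]
# Point 4: PCs are orthogonal (dot products between different PCs are ~0),
# so pcs.T @ pcs is approximately the identity matrix.
print(np.round(pcs.T @ pcs, 6))

# Point 2: keep only PC1 and PC2 to go from 3 dimensions down to 2.
X_reduced = Xc @ pcs[:, :2]
print(X_reduced.shape)               # (100, 2)
```

Dropping the lowest-variance components is what realises points 1 and 2: the model trains on fewer, decorrelated features.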
Thanks bro
Thanks
You saved my lots of time
5th point: today's video is going to be amazing 😂😂
Thank you for saving our career ❤️
01:09 PCA helps in overcoming the problem of overfitting caused by too many attributes and features during the training phase.
02:18 Principal Component Analysis (PCA) helps reduce overfitting.
03:27 Principal component analysis helps in reducing overfitting by reducing dimensions and finding principal components.
04:36 Principal components can be found using views to analyze the data from different perspectives.
05:45 The model generated two principal components: PC1 and PC2.
06:54 Principal components can be generated from multiple attributes and reduce the dimensionality.
08:03 Give the highest importance to PC1 and reduce the priority of the other principal components.
09:07 Principal Component Analysis (PCA) explained in a nutshell.
How come I always end up finding the best teachers on UA-cam one day before my exam? Haha
Because we start searching for videos only one day before the exam😂
@@vishnum9613 right 😂 6 hours remaining and it's 3:24 am😂
😂 Because we don't worry about stuff until it's very close to us
But why do you guys wait for the last moment??
It's a talent possessed only by back benchers 😂🤣🤣
Huge respect, sir!!! You are surely 100% better than those university lecturers!! Because of you I can easily clear my concepts of ML, ERTOS, ICS! Thank you so much for the help!!! I really appreciate that you are doing this with no returns, just giving away free education!! Hats off!!!!
I never knew Rohit Sharma was this good at ML. Way to go champ
PCA:
Need: overfitting; too many attributes and features before training, which need to be reduced.
PCA reduces overfitting (in overfitting the model tries to reach every point) by going from high dimensionality to low dimensionality.
Views: from the top, PC1; from another point, PC2. PC1 gets higher priority. PC1 and PC2 must have the orthogonal property, i.e. be independent of each other.
He is the best, man!!
Amazing learning videos. During every exam paper he is there to help.
Thanks, sir, more power to you!
Brother, thanks yaar, you teach in such a simple way that it's a joy... please, brothers, like this and subscribe too... thanks yaar.
Sir, you look like Desi Gamers' "Amit Bhai" 😂
yas!
good explanation buddy
You're enjoying a kick landing on your own stomach (undercutting your own livelihood)?
At least he is supporting the better content without being arrogant! It's not a matter of kicking his own livelihood.
Wow, a comment from LMT ❤️
While searching, your videos don't show up for some topics, but if even one video turns up, that settles it; a different kind of happiness appears on my face 😁😁
100% satisfaction on a topic is guaranteed while watching your videos, sir.
Thank you so much
Thanks!
Best Tutorial Found on UA-cam...!!
Happy teacher's day 💐💐
It was such a confusing topic and you made it so easy... thanks a ton, sir.
Question: When we project our attributes onto PC1, all the points get projected onto that line; the same goes for PC2, all the points get projected onto PC2. Then how are they independent? We can find the same point on PC1 as well as PC2 (my assumption).
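One way to see the independence the question asks about: yes, every point gets a coordinate on both PC1 and PC2, but the two lists of coordinates are uncorrelated. A small NumPy sketch (synthetic data, purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 2-D data with correlated attributes.
X = rng.normal(size=(200, 2)) @ np.array([[2.0, 0.0], [1.0, 0.5]])
Xc = X - X.mean(axis=0)

eigvals, eigvecs = np.linalg.eigh(np.cov(Xc.T))
pc1, pc2 = eigvecs[:, -1], eigvecs[:, -2]   # PC1 = largest-variance direction

# Every point has a coordinate on PC1 AND on PC2 -- but the two coordinate
# lists are uncorrelated, which is what "independent" means here.
t1, t2 = Xc @ pc1, Xc @ pc2
print(np.round(np.dot(pc1, pc2), 6))        # directions are orthogonal
print(np.round(np.cov(t1, t2)[0, 1], 6))    # projections are uncorrelated
```

So the same point appearing on both lines is expected; independence refers to the directions being orthogonal and the projected coordinates having zero covariance.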
This is my favourite UA-cam channel because it always reduces my stress and exam tension 😊😊
Sir/bhaiya, you explain things so well... even for free.
you are the best teacher
Not only engineering... it's also for geography ❤️
If you watch sir at 1.5x, you'll just love the energy.
I am already loving it ❤️ ....at 1.5x though 😂
I seriously don't know how you have so few subscribers; you are a life saver and obviously a good teacher/explainer 🙏🙏🙏
Keep up the good work
This video is far better than my college professor's lectures.
Sir ,you are really Superb..👍please continue all this.👏👏👏👏👏👏⚘⚘⚘⚘
Sir, please continue making videos; your channel is literally a gold mine. You teach far better than our US university professors.
If you're studying from US university professors, then what are you doing on youtube, bro?
Wow....nice
You explained a very complicated idea with very easy tips. Thanks, brother. Stay at peace.
Appreciate your effort. Your videos are very informative and easy to understand.
Awesome Sir,
A vigorous teacher, Quality Unmatched.
And Iqbal Bazmi sahab, come over to the room sometime.
THANK YOU SIR, I just love the energy with which you teach. Thank you so much, sir; you are a great teacher.
Great explanation sir ..thank you sir ☺️
Good to hear such an informative video.
Bro, that was great....
According to Andrew Ng's machine learning course, PCA should be used to increase the speed of the learning algorithm, not to prevent overfitting; use regularisation to prevent overfitting.
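A hedged sketch of that point: L2 regularisation directly constrains the model's weights, which is the standard overfitting fix, while PCA mainly shrinks the input dimension for speed. A toy ridge-regression example in NumPy (synthetic data, illustrative only):

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical noisy data: many attributes, few samples
# (a setting prone to overfitting).
n, d = 30, 20
X = rng.normal(size=(n, d))
y = X[:, 0] + 0.1 * rng.normal(size=n)   # only attribute 0 truly matters

def ridge(X, y, lam):
    # Closed-form L2-regularised least squares: (X'X + lam*I)^-1 X'y
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

w_plain = ridge(X, y, 0.0)    # ordinary least squares (overfits)
w_reg = ridge(X, y, 10.0)     # regularised: coefficients shrink

# Regularisation, not PCA, is what directly fights overfitting here.
print(np.linalg.norm(w_plain), np.linalg.norm(w_reg))
```

The regularised weight vector has a smaller norm, which is exactly the shrinkage that tames overfitting; PCA would instead have reduced `d` before fitting.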
Salute, sir!!! You explained it very nicely compared to university teachers; I could clear my concepts of ML. Thank you, sir; huge respect for you. Keep it up, sir!
There's only one heart, sir; how many times will you win it? 😂
Very good explanation in short time... 👍👍
Sir, I watched every video on your channel for my 8th-sem final papers; they are helping me a lot, and I'm from RGPV. Thank you so much.
Wow, dude.
How many marks did you get?
@@PranavKumar1991 Above 75.
Huge respects sir !! Thank you sir
Awesome, and I mean awesome, explanation, sir. Thank you so much.
Well, I must appreciate your work... I just want to thank you for saving time and coming straight to the point... I want a video on regularization... please 😄 😄 😄
Wow, brother, wow.
Well done, bro. The videos are amazing.
Your vedio is very helpful for my semester exams.
Thanku so much sir...
*Video. not Vedio.
Amazing Effort !! 😄 😄
Thanks a lot for this video💯💯💯💯💯💯💯💯
Sir I've watched your maximum possible videos. So I think no one is better than you.
And now we need the video of "find S algorithm" & "candidate elimination" video.
🙏If it's possible so please sir this is my humble request pls make this video🙏
5 Mins engineering, gate smashers and Sanchit sir are the life saviors 🌚🌚 lots of love....!!
true hehe 😹
I came just to understand PCA, but I loved your way of explaining, and now I am a new subscriber.
Yes sir... as time is short, please concentrate only on important topics.
I am going to like all your videos.
Best video, sir; it helped me a lot.
Superb explanation.
Mind blowing sir
Love you brother 🌻🌻🌻
THANKS MANNNNNN
Very good ML content; people are paying so much money for ML courses without looking at this content.
Love From IIT Dholakpur Sir
sir thank you very much, you explain very well.
Why didn't I watch this before? Very helpful. Thank you.
Thank you so much sir...very useful video sir...😊
Nice content, it really helps me a lot... Request you to keep uploading more and more videos... Once again, thank you so much.
Thank you, Sir; videos from this series helped me get a clear understanding of the concepts. Keep making such videos; they sure help a lot.
"Dimag ki batti galat jal gai" (the bulb in my brain lit up wrong) was epic, sir.
Excellent explanation, thank you sir.
There's a small silly mistake: it's "Principal", not "Principle".
"Principle" is indeed correct.
Ok
Pin it sir
Best and simplest possible explanation.
Very easily explained and easy to understand, amazing! Keep up the good work :)
superb video
Please do a video on SVD (Singular Value Decomposition). I really love your videos, very useful. Thank you so much.
Thanks brother, it was very helpful; you gained a subscriber.
Sir, you deserve the Nobel Prize. Your way of explaining is so amazing.
Very nice and precise explanation. You did a lot of homework on PCA to make it precise. Thank you.
awesome i loved it...
It was a nice and simple explanation.
Excellent
Thank you so much sir,....you are great.😍
Our Faizal will get everyone to pass in 5 minutes.
Well said, brother.
Great video. A few points need clarification though:
1. Why will only PC1 be considered? Does that mean only one view is always used?
2. What is a "view" exactly?
3. How does being orthogonal make them different? (Is it the orthogonality property?)
4. (Most important) Just by taking a different view, how are the features reduced? Aren't we still feeding all the features into training? (This explanation was abstract; a bit more technical detail would have done wonders.)
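A rough numerical answer to points 1 and 4, under assumed synthetic data: PC1 is prioritised because it carries the largest share of the total variance, and after projection only the kept components (not all the original attributes) go into training. A NumPy sketch (all numbers here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical dataset: 4 original attributes, mostly varying along one axis.
X = rng.normal(size=(150, 4)) * np.array([5.0, 1.0, 0.5, 0.1])
Xc = X - X.mean(axis=0)

eigvals, eigvecs = np.linalg.eigh(np.cov(Xc.T))
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]   # PC1 first

# Point 1: PC1 is "considered first" because its share of the total
# variance is by far the largest.
ratio = eigvals / eigvals.sum()
print(np.round(ratio, 3))

# Point 4: training then uses the k projected columns, not all 4 attributes.
X_train = Xc @ eigvecs[:, :2]        # only 2 features go to training now
print(X_train.shape)                 # (150, 2)
```

In practice one keeps enough components to cover, say, 95% of the variance; the discarded directions are the low-information "views".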
That was great; I didn't learn this much in three years.
Sirrrrr, here I am again.
great explanation
Superb Explanation !
Thank you so much sir 🙏
I understood everything thank you so much 👌👌
Thank you sir ☺️🙏
Awesome work!
Thanks for sharing.... Good efforts
We would have no desire to pass, if you weren't there, if you weren't there 🙏🙏🙏🙏
Best and simplest explanation... can you suggest any book or PDF on PCA?
Thank you sir.. You are doing a great job
Andrew Ng: do not apply PCA to reduce overfitting.
5 Min Engineering: hold my opening statement.
❤️thank you sir..great explanation
Can you please upload a practical demonstration of PCA?
Thanks also from the agricultural side.
Easy explanation