Basics Of Principal Component Analysis Part-1 Explained in Hindi ll Machine Learning Course

  • Published 28 Dec 2024

COMMENTS • 328

  • @artyCrafty4564
    @artyCrafty4564 1 year ago +160

    4 important points from the video --
    1. PCA solves the problem of overfitting.
    2. PCA reduces a high-dimensionality dataset to a low-dimensionality one.
    3. The number of PCs can be less than or equal to the number of attributes, although it also depends on other factors such as the data's dimensionality.
    4. PCs should be orthogonal, i.e., independent of each other (see the code sketch below).
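
    A minimal scikit-learn sketch of these four points (the data and code below are illustrative assumptions, not from the video):

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(0)
      X = rng.normal(size=(200, 10))           # 200 samples, 10 attributes (high dimensionality)

      pca = PCA(n_components=2)                # number of PCs <= number of attributes (point 3)
      X_reduced = pca.fit_transform(X)         # high -> low dimensionality (point 2)

      print(X_reduced.shape)                   # (200, 2)
      print(pca.explained_variance_ratio_)     # PC1 captures the largest share of the variance
      # Point 4: the principal axes are orthogonal, so their dot product is ~0
      print(np.dot(pca.components_[0], pca.components_[1]))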

  • @sereto7867
    @sereto7867 2 years ago +23

    Thank you for saving our career ❤️

  • @prathmeshphatake1948
    @prathmeshphatake1948 2 months ago +8

    01:09 PCA helps in overcoming the problem of overfitting caused by too many attributes and features during the training phase.
    02:18 Principal Component Analysis (PCA) helps reduce overfitting
    03:27 Principal component analysis helps in reducing overfitting by reducing dimensions and finding principal components
    04:36 Principal components can be found using views to analyze the data from different perspectives.
    05:45 The model generated two principal components: PC1 and PC2.
    06:54 Principal components can be generated from multiple attributes and reduce the dimensionality
    08:03 Give highest importance to PC1 and reduce priority for other principal components.
    09:07 Principal Component Analysis (PCA) explained in a nutshell
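
    The 08:03 point in the summary above boils down to the explained-variance ranking; a small sketch, assuming scikit-learn and using the Iris dataset purely for illustration:

      import numpy as np
      from sklearn.datasets import load_iris
      from sklearn.decomposition import PCA

      X = load_iris().data                         # 150 samples, 4 attributes
      pca = PCA().fit(X)                           # keep every component just to inspect the ranking

      for i, ratio in enumerate(pca.explained_variance_ratio_, start=1):
          print(f"PC{i}: {ratio:.2%} of the variance")

      # The cumulative sum shows how few PCs are usually enough.
      print(np.cumsum(pca.explained_variance_ratio_))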

  • @basudhasakshyarika1592
    @basudhasakshyarika1592 3 years ago +259

    How come I end up finding the best teachers on YouTube one day before my exam. Haha

    • @vishnum9613
      @vishnum9613 3 years ago +39

      Because we start searching for videos only one day before the exam😂

    • @lvl-x_Esport
      @lvl-x_Esport 3 years ago +17

      @@vishnum9613 right 😂 6 hours remaining and it's 3:24 am😂

    • @doctorstrange4127
      @doctorstrange4127 2 years ago +6

      😂 Because we aren't worried about stuff until it gets very close to us

    • @iamrichaf1616
      @iamrichaf1616 2 years ago

      But why do you guys wait for the last moment??

    • @ciycodeityourself6152
      @ciycodeityourself6152 2 years ago +1

      It's a talent possessed only by back benchers 😂🤣🤣

  • @kaustubh7304
    @kaustubh7304 4 years ago +36

    Huge respect, sir!!! You are surely 100% better than those university lecturers!! Because of you I can easily clear my concepts of ML, ERTOS, and ICS! Thank you so much for the help!!! I really appreciate that you are doing this expecting nothing in return, just giving away free education!! Hats off!!!!

  • @DoomedVortex
    @DoomedVortex 1 year ago +1

    I never knew Rohit Sharma was this good at ML. Way to go champ

  • @lunapotter5593
    @lunapotter5593 2 years ago +4

    PCA:
    Need: overfitting; too many attributes and features before training, which need to be reduced.
    PCA reduces overfitting (in overfitting, the model tries to reach every point) by going from high dimensionality to low dimensionality.
    Views: from the top gives PC1, from another point gives PC2; PC1 gets higher priority; PC1 and PC2 must have the orthogonal property, i.e., be independent of each other.
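
    A rough sketch of the workflow in these notes, assuming scikit-learn; the breast-cancer dataset and logistic regression are illustrative choices, not from the video:

      from sklearn.datasets import load_breast_cancer
      from sklearn.decomposition import PCA
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      X, y = load_breast_cancer(return_X_y=True)           # 30 attributes

      # Standardize, project onto 2 principal components, then classify.
      model = make_pipeline(StandardScaler(),
                            PCA(n_components=2),
                            LogisticRegression(max_iter=1000))
      print(cross_val_score(model, X, y, cv=5).mean())     # accuracy using only 2 PCs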

  • @johnwicckk
    @johnwicckk 2 years ago +6

    He is the best, man!!
    Amazing learning videos. During every exam paper he is there to help.
    Thanks sir, more power to you!

  • @manujpande8544
    @manujpande8544 5 years ago

    Brother, thanks, man; you teach in such a simple way that it's a joy... please, brothers, like this and subscribe too... thanks.

  • @RAKESH-ie1vb
    @RAKESH-ie1vb 1 year ago +13

    Sir, you look like Desi Gamers' "Amit bhai" 😂

  • @Lastmomenttuitions
    @Lastmomenttuitions 5 years ago +70

    good explanation buddy

    • @pradumnasoni1652
      @pradumnasoni1652 5 years ago +4

      So you're enjoying a kick landing on your own stomach (undercutting your own business)?

    • @Bhatonia_Jaat
      @Bhatonia_Jaat 3 years ago +5

      At least he is supporting the better content without being arrogant! It's not a matter of a kick in the stomach.

    • @raghav042
      @raghav042 3 years ago +1

      Wow, a comment from LMT ❤️

  • @ashwinbankar9
    @ashwinbankar9 1 year ago

    When searching for a topic, for some topics your videos don't turn up, but if even one video shows up, that settles it; a different kind of happiness comes over my face 😁😁

  • @prasanthkumar6393
    @prasanthkumar6393 2 years ago +4

    100% satisfaction is guaranteed on a topic while watching your videos sir.
    Thank you so much

  • @Baetu123
    @Baetu123 3 years ago

    Thanks!

  • @faizejafri1014
    @faizejafri1014 5 years ago +4

    Best tutorial found on YouTube...!!

  • @a-archanabichkule
    @a-archanabichkule 1 year ago +2

    Happy teacher's day 💐💐

  • @pranay6708
    @pranay6708 5 years ago +6

    It was such a confusing topic and you made it so easy... thanks a ton, sir.

  • @alimehmood8654
    @alimehmood8654 4 years ago +13

    Question: When we cast our attributes onto PC1, all of them get cast onto that line, and the same goes for PC2: all the points get cast onto PC2. Then how are they independent? We can find the same point on PC1 as well as on PC2 (my assumption).
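
    A small sketch related to the question above, assuming scikit-learn and synthetic data: every point does get a coordinate on both PC1 and PC2, but because the axes are orthogonal the two coordinate columns come out uncorrelated, which is the sense in which the components are independent:

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(42)
      X = rng.multivariate_normal(mean=[0, 0], cov=[[3.0, 1.2], [1.2, 1.0]], size=500)

      scores = PCA(n_components=2).fit_transform(X)          # column 0: PC1 scores, column 1: PC2 scores
      print(np.corrcoef(scores[:, 0], scores[:, 1])[0, 1])   # ~0: the two projections are uncorrelated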

  • @sakshibagade7092
    @sakshibagade7092 4 months ago

    This is my favourite YouTube channel because it always reduces my exam stress and tension 😊😊

  • @nagraj0308
    @nagraj0308 3 years ago

    Sir/bhaiya, you explain things so well... and for free

  • @apurvaghodeswar9264
    @apurvaghodeswar9264 2 years ago +1

    you are the best teacher

  • @benojiryasmin9174
    @benojiryasmin9174 3 years ago +8

    Not only engineering... it's also for geography ❤️

  • @mr.curious1329
    @mr.curious1329 3 years ago +2

    If you watch sir at 1.5x, you'll just love the energy.
    I am already loving it ❤️ ...at 1.5x though 😂

  • @creator025
    @creator025 5 years ago +3

    I seriously don't know how you have so few subscribers; you are a life saver and obviously a good teacher/explainer 🙏🙏🙏
    Keep up the good work

  • @sahil2pradhan
    @sahil2pradhan 2 years ago

    This video is far better than my college professor's lectures.

  • @ASh-hb1ub
    @ASh-hb1ub 4 years ago

    Sir ,you are really Superb..👍please continue all this.👏👏👏👏👏👏⚘⚘⚘⚘

  • @sam9620
    @sam9620 4 years ago

    Sir, please continue making videos; your channel is literally a gold mine. You teach far better than the professors at our US university.

    • @siddheshbandgar6927
      @siddheshbandgar6927 3 years ago

      If you're studying with US university professors, then what are you doing on YouTube, brother?

  • @SB_Roy_Vlogs
    @SB_Roy_Vlogs 1 year ago +1

    Wow....nice

  • @samiuddin6696
    @samiuddin6696 3 years ago

    You explained a very complicated idea in very simple terms. Thanks, brother. Peace be with you.

  • @saqib317
    @saqib317 1 year ago

    Appreciate your effort. Your videos are very informative and easy to understand.

  • @muhammadiqbalbazmi9275
    @muhammadiqbalbazmi9275 5 years ago +10

    Awesome Sir,
    A vigorous teacher, Quality Unmatched.

  • @sshubam
    @sshubam 2 years ago

    THANK YOU SIR, I just love the energy with which you teach. Thank you so much sir, you are a great teacher.

  • @seducation9982
    @seducation9982 1 year ago

    Great explanation sir ..thank you sir ☺️

  • @Fatima-pp5ue
    @Fatima-pp5ue 2 years ago

    Good to hear such an informative video

  • @sharadpkumar
    @sharadpkumar 9 months ago

    Brother, that was great....

  • @vaibhavdiwan1569
    @vaibhavdiwan1569 4 years ago +5

    According to Andrew Ng's machine learning course, PCA should be used to increase the speed of the learning algorithm, not to prevent overfitting; use regularisation to prevent overfitting.
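
    A minimal illustration of the regularisation alternative mentioned above, assuming scikit-learn (dataset and model are illustrative choices): the penalty strength C, rather than PCA, is what one would tune against overfitting:

      from sklearn.datasets import load_breast_cancer
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler

      X, y = load_breast_cancer(return_X_y=True)

      for C in (0.01, 1.0, 100.0):                     # smaller C = stronger L2 regularisation
          model = make_pipeline(StandardScaler(), LogisticRegression(C=C, max_iter=1000))
          print(C, cross_val_score(model, X, y, cv=5).mean())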

  • @ompandya30
    @ompandya30 1 year ago

    Salute, sir!!! You explain very nicely compared to university teachers; I could clear my ML concepts. Thank you sir, huge respect for you, keep it up, sir!!

  • @siddharthpatil1879
    @siddharthpatil1879 3 years ago

    I have only one heart; how many times will you win it, sir? 😂

  • @banditasahoo9663
    @banditasahoo9663 4 years ago

    Very good explanation in short time... 👍👍

  • @rickyraina8266
    @rickyraina8266 5 years ago +5

    Sir, I watched every video on your channel for my 8th sem final papers; they are helping me a lot. I'm from RGPV. Thank you so much.

  • @Hayat26474
    @Hayat26474 6 months ago

    Huge respects sir !! Thank you sir

  • @sudarshandev6369
    @sudarshandev6369 3 years ago

    Awesome, I mean awesome explanation sir, thank you so much

  • @sonalisingh2136
    @sonalisingh2136 5 years ago +3

    Well, I must appreciate... your work... I just want to thank you for saving time and coming straight to the point... I want a video on regularization... please 😄 😄 😄

  • @Raag_Jhankaar
    @Raag_Jhankaar 3 years ago

    Wow, brother, wow

  • @minhaaj
    @minhaaj 4 years ago

    Well done, bro. The videos are amazing.

  • @JyotiSingh-rz2gg
    @JyotiSingh-rz2gg 4 years ago

    Your video is very much helpful for my semester exams.
    Thank you so much, sir...

  • @bhavya2301
    @bhavya2301 4 years ago

    Amazing Effort !! 😄 😄

  • @vedant6460
    @vedant6460 2 years ago

    Thanks a lot for this video💯💯💯💯💯💯💯💯

  • @kaushalendrarathour9909
    @kaushalendrarathour9909 4 years ago

    Sir, I've watched as many of your videos as possible, and I think no one is better than you.
    Now we need videos on the "Find-S algorithm" and "Candidate Elimination".
    🙏 If possible, sir, this is my humble request: please make these videos 🙏

  • @ritikarauthan3304
    @ritikarauthan3304 1 year ago +1

    5 Minutes Engineering, Gate Smashers and Sanchit sir are the life savers 🌚🌚 Lots of love....!!

  • @adityakumarmishra8734
    @adityakumarmishra8734 4 years ago

    I came just to understand PCA, but I loved your way of explaining and now I am a new subscriber.

  • @maxpayne880
    @maxpayne880 5 years ago

    Yes sir... as time is short, please concentrate only on the important topics

  • @nagraj0308
    @nagraj0308 3 years ago

    I am going to like all your videos

  • @rubina-hq3gc
    @rubina-hq3gc 3 years ago

    Best video sir, it helps me a lot

  • @freetube7767
    @freetube7767 5 years ago +1

    Superb explanation.

  • @silparaniswain5492
    @silparaniswain5492 1 year ago

    Mind blowing sir

  • @girijaprasadpatnaik2113
    @girijaprasadpatnaik2113 3 years ago

    Love you brother 🌻🌻🌻

  • @vickyrajray2952
    @vickyrajray2952 1 year ago +1

    THANKS MANNNNNN

  • @akashpal3415
    @akashpal3415 5 years ago

    Very good ML content; people are paying so much money for ML courses without looking at this content

  • @devr4j
    @devr4j 6 months ago

    Love From IIT Dholakpur Sir

  • @manishn2442
    @manishn2442 2 years ago

    sir thank you very much, you explain very well.

  • @MrDeepak8866
    @MrDeepak8866 5 years ago +1

    Why didn't I watch this before? Very helpful. Thank you

  • @gayathri5216
    @gayathri5216 4 years ago

    Thank you so much sir...very useful video sir...😊

  • @nishiraju6359
    @nishiraju6359 4 years ago

    Nice content, it really helps me a lot... Request you to keep uploading videos, more and more... Once again, thank you so much

  • @pratikpande5917
    @pratikpande5917 5 years ago +6

    Thank you Sir, videos from this series helped me get a clear understanding about the concepts. Keep making such videos , they sure help a lot.

  • @biswadeepdas8757
    @biswadeepdas8757 3 months ago

    "Dimag ki batti galat jal gayi" (the bulb in my head lit up the wrong way) was epic, sir

  • @sumeetkaur902
    @sumeetkaur902 3 years ago

    Excellent Explanation, Thank U sir

  • @5MinutesEngineering
    @5MinutesEngineering 5 years ago +8

    There's a small silly mistake: it's "Principal", not "Principle"

  • @madhushreearun1089
    @madhushreearun1089 5 years ago +2

    Best and simplest possible explanation.

  • @jeniajeba7230
    @jeniajeba7230 5 years ago +2

    very easily explained and easy to understand , amazing ! keep up the good work :)

  • @datasciencewithshreyas1806
    @datasciencewithshreyas1806 4 years ago

    superb video

  • @nalisharathod6098
    @nalisharathod6098 4 years ago +3

    Please do a video on SVD (Singular Value Decomposition). I really love your videos; very useful. Thank you so much

  • @thedeepakmor
    @thedeepakmor 4 years ago

    Thanks brother, it was very helpful; you gained a subscriber

  • @naeemchaudry733
    @naeemchaudry733 3 years ago +3

    sir you deserve the nobel prize. your way of explaining is so amazing.

  • @mohammedrehman4109
    @mohammedrehman4109 4 years ago

    Very nice and precise explanation. You clearly did a lot of homework on PCA to make it this precise. Thank you

  • @AnuragSingh-vv3qv
    @AnuragSingh-vv3qv 4 years ago

    awesome i loved it...

  • @parthprajapati3487
    @parthprajapati3487 4 years ago

    It was a nice and simple explanation

  • @pardeep.ksharma9136
    @pardeep.ksharma9136 4 years ago

    Excellent

  • @deepsant2372
    @deepsant2372 5 years ago +2

    Thank you so much sir,....you are great.😍

  • @saiful_not_found
    @saiful_not_found 5 years ago +4

    Apna Phaizal will get everyone to pass in 5 minutes

  • @nikp9999
    @nikp9999 4 years ago

    Well said, brother

  • @hayatt143
    @hayatt143 4 years ago

    Great video. A few points need clarification, though:
    1. Why is only PC1 considered? Does that mean only one view is always used?
    2. What exactly is a "view"?
    3. How does being orthogonal make the components different? (Is that the orthogonality property?)
    4. (Most important) Just by taking a different view, how are the features reduced? Aren't we still feeding all the features into training? (This explanation was abstract; something a bit more technical, like the sketch below, would have done wonders.)
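
    A sketch touching on points 1 and 4 above, assuming scikit-learn and the Iris dataset purely for illustration: PCA does not drop original features, it combines them; every PC is a weighted mix of all attributes, and PC1 simply carries the largest share of the variance:

      from sklearn.datasets import load_iris
      from sklearn.decomposition import PCA

      data = load_iris()
      pca = PCA(n_components=2).fit(data.data)

      print(pca.explained_variance_ratio_)             # why PC1 gets the highest priority (point 1)
      for i, axis in enumerate(pca.components_, start=1):
          weights = ", ".join(f"{name}: {w:+.2f}" for name, w in zip(data.feature_names, axis))
          print(f"PC{i} = {weights}")                  # every original feature contributes (point 4)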

  • @mukeshsirvi6378
    @mukeshsirvi6378 4 years ago

    That was great; I didn't learn this much even in three years

  • @cybershrajal
    @cybershrajal 17 days ago

    Sirrrrr, I'm back again.

  • @mohammadnafees9704
    @mohammadnafees9704 5 years ago +1

    great explanation

  • @l2mbenop346
    @l2mbenop346 5 years ago +1

    Superb Explanation !

  • @anaya1012
    @anaya1012 3 years ago

    Thank you so much sir 🙏

  • @weekendvibes468
    @weekendvibes468 3 years ago

    I understood everything thank you so much 👌👌

  • @animationcrust1993
    @animationcrust1993 4 years ago +1

    Thank you sir ☺️🙏

  • @ShalabhBhatnagar-vn4he
    @ShalabhBhatnagar-vn4he 4 years ago

    Awesome work!

  • @poojamankar
    @poojamankar 5 years ago

    Thanks for sharing.... Good efforts

  • @ankitaray1405
    @ankitaray1405 2 years ago

    We would have had no more desire to pass, if you weren't there, if you weren't there 🙏🙏🙏🙏

  • @preetisengar5900
    @preetisengar5900 3 years ago

    Best and simplest explanation... can you suggest any book or PDF on PCA?

  • @tapanjeetroy8266
    @tapanjeetroy8266 5 years ago

    Thank you sir.. You are doing a great job

  • @k_anu7
    @k_anu7 2 years ago

    Andrew Ng - do not apply PCA to reduce overfitting,
    Le 5 Minutes Engineering - hold my opening statement

  • @azmatsiddique3564
    @azmatsiddique3564 5 years ago +1

    ❤️thank you sir..great explanation

  • @GhanshyamAbrol
    @GhanshyamAbrol 4 years ago

    Thanks also from the agricultural side

  • @shrutipanchal5203
    @shrutipanchal5203 4 years ago

    Easy explanation