Feature Selection Techniques Explained with Examples in Hindi ll Machine Learning Course

  • Published 28 Nov 2024

COMMENTS •

  • @heenarupabheda6021
    @heenarupabheda6021 4 years ago +23

    That line, "Aaj ka video bahut hi kamal ka hone wala hai" ("Today's video is going to be amazing")..😄
    Sir, every one of your videos is amazing..😀

  • @tanvisshah26
    @tanvisshah26 3 years ago +139

    Watched you when I did my Bachelor's, watching you now when I'm doing my Master's!

  • @bhavikdudhrejiya852
    @bhavikdudhrejiya852 3 years ago +68

    This is a comprehensive list of the various feature selection methods:
    1. Filter Methods
    A. Basic Filter Method
    1. Constant Features
    2. Quasi Constant Features
    3. Duplicate Features
    B. Correlation Filter Methods
    1. Pearson Correlation Coefficient
    2. Spearman's Rank Corr Coef
    3. Kendall's Rank Corr Coef
    C. Statistical & Ranking Filter Methods
    1. Mutual Information
    2. Chi Square Score
    3. ANOVA Univariate
    4. Univariate ROC-AUC / RMSE
    ------------------------------------------------------------------------
    2. Wrapper Methods
    A. Search Methods
    1. Forward Feature Selection
    2. Backward Feature Elimination
    3. Exhaustive Feature Selection
    B. Sequential Floating
    1. Step Floating Forward Selection
    2. Step Floating Backward Selection
    C. Other Search
    1. Bidirectional Search
    ------------------------------------------------------------------------
    3. Embedded Methods
    A. Regularization
    1. LASSO
    2. Ridge
    3. Elastic Nets
    B. Tree Based Importance
    1. Feature Importance
    ------------------------------------------------------------------------
    4. Hybrid Method
    A. Filter & Wrapper Methods
    B. Embedded & Wrapper Methods
    1. Recursive Feature Elimination
    2. Recursive Feature Addition
    ------------------------------------------------------------------------
    5. Advanced Methods
    A. Dimensionality Reduction
    1. PCA
    2. LDA
    B. Heuristic Search Algorithms
    1. Genetic Algorithm
    C. Feature Importance
    1. Permutation Importance
    D. Deep Learning
    1. Autoencoders
    ------------------------------------------------------------------------

    • @ravishankar2180
      @ravishankar2180 3 years ago +7

      main topic: Dimensionality Analysis
      types: 1. feature selection 2. feature extraction
      1 - 4: feature selection (here we just eliminate features based on analysis)
      5: feature extraction (here we combine two or more features)

    • @manasmore8893
      @manasmore8893 1 year ago

      prime example of over-fitting
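
The basic filter methods at the top of that list can be sketched in a few lines of pandas (a minimal illustration with made-up column names, not code from the video):

```python
import numpy as np
import pandas as pd

# Toy frame with one constant, one quasi-constant, one duplicate,
# and one informative column (all names are illustrative).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "constant": np.ones(100),
    "quasi_constant": [0.0] * 99 + [1.0],
    "signal": rng.normal(size=100),
})
df["duplicate"] = df["signal"]

# 1. Constant features: only one distinct value.
constant = [c for c in df.columns if df[c].nunique() == 1]

# 2. Quasi-constant features: one value covers >= 99% of rows.
quasi = [c for c in df.columns
         if c not in constant
         and df[c].value_counts(normalize=True).iloc[0] >= 0.99]

# 3. Duplicate features: identical to an earlier column.
dupes = df.T[df.T.duplicated()].index.tolist()

kept = [c for c in df.columns
        if c not in set(constant) | set(quasi) | set(dupes)]
print(kept)  # -> ['signal']
```

The correlation filters in the same branch follow the same pattern: compute `df.corr()` and drop one column of each pair whose absolute correlation exceeds a threshold.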

  • @humancodex
    @humancodex 5 years ago +12

    Brother, you are a real engineer, salute.

  • @akshayjadhav3004
    @akshayjadhav3004 2 years ago +3

    Got my B.E. result today with distinction.. Thank you so much, sir ji, for such smooth teaching..😍

  • @41abhishek
    @41abhishek 5 years ago +8

    Excellent tutorial.
    But regarding the embedded method.. (as per my understanding) the algorithm itself filters out the unimportant features; the best example is regularization.
    Lasso regularization in linear regression can shrink an unimportant feature's coefficient exactly to zero (its coefficient is already low, and after regularization it becomes zero), while Ridge only shrinks coefficients toward zero without eliminating them.
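
The contrast this comment describes is easy to see on synthetic data (a sketch using scikit-learn; the data and alpha values are arbitrary):

```python
import numpy as np
from sklearn.linear_model import Lasso, Ridge

# y depends only on the first two of five features.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 5))
y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=200)

lasso = Lasso(alpha=0.1).fit(X, y)
ridge = Ridge(alpha=1.0).fit(X, y)

# Lasso's L1 penalty drives the irrelevant coefficients exactly
# to zero (implicit feature selection); Ridge's L2 penalty only
# shrinks them, so no feature is actually removed.
print((lasso.coef_ == 0).sum())  # number of exactly-zero Lasso coefficients
print((ridge.coef_ == 0).sum())  # Ridge: none are exactly zero
```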

  • @bhavikdudhrejiya4478
    @bhavikdudhrejiya4478 5 years ago +3

    Nicely explained.
    Learning points:
    1. What is feature selection?
    2. Why do we require feature selection?
    3. Why does this model have low efficiency?
    4. Optimal selection of the feature
    5. Techniques of Feature Selection
    a. Filter Methods: 1.IG 2. Chi-Square Test 3. Correlation Coefficient
    b. Wrapper Methods: 1. Recursive Feature Elimination 2. Genetic Algorithm
    c. Embedded Methods: Decision Trees
    6. General Version of Filter Methods
    7. General Version of Wrapper Methods and Embedded Method
    8. What is wrapping?
    9. Generate multiple models with a different subset of features
    10. Difference Between Wrapper Methods and Embedded Methods
    11. Advantages and Disadvantages
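
Recursive feature elimination, which this summary lists under wrapper methods, can be sketched with scikit-learn (synthetic data; the estimator choice is arbitrary):

```python
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.linear_model import LogisticRegression

# 10 features, of which only 4 are informative.
X, y = make_classification(n_samples=300, n_features=10,
                           n_informative=4, n_redundant=0,
                           random_state=0)

# RFE repeatedly fits the model and drops the weakest feature
# (smallest |coefficient|) until the requested number remain.
rfe = RFE(LogisticRegression(max_iter=1000),
          n_features_to_select=4).fit(X, y)
print(rfe.support_)   # boolean mask: exactly 4 True
print(rfe.ranking_)   # rank 1 = kept; higher = eliminated earlier
```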

  • @sadaqathussain1310
    @sadaqathussain1310 2 years ago

    Wow.. You did an amazing job, sir, excellent. There is probably no easier way to explain this to beginners. Thank you.

  • @randomperson7009
    @randomperson7009 2 years ago

    Only 4 words: You are the BEST.

  • @sam9620
    @sam9620 4 years ago

    The best channel i have found so far for my data mining course. 100/100

  • @alirauf9022
    @alirauf9022 5 years ago +1

    Absolutely right, bhaiyya, absolutely right.

  • @rahulchadar9096
    @rahulchadar9096 5 years ago +2

    Sir ji, where were you all these days? Now my engineering is almost over... wish I had found you earlier 👍👍

  • @tanveergulzar4715
    @tanveergulzar4715 2 years ago

    I think out of 3.80 lakh subscribers, 3.70 lakh are the ones who study one day before the exam😂😂😂. You are a genius. Thank you.

  • @Experts_Experience
    @Experts_Experience 2 years ago +4

    studying Machine learning from rohit sharma

  • @mehulsoni2300
    @mehulsoni2300 4 years ago

    Fantastic, bhai... thanks for teaching in an interactive way... your enthusiasm is amazing.

  • @WondersOfWandering
    @WondersOfWandering 3 years ago +2

    Such interesting videos on a topic I earlier found difficult to understand and boring. Now I am able to understand it in just 5-10 minutes in the easiest and most interesting manner.
    Thank you so much!!!

  • @mohammadsaif3677
    @mohammadsaif3677 4 years ago +1

    Your explanation delivery is very good... people connect with you... Good stuff, mate.

  • @shrunkhalshahane3323
    @shrunkhalshahane3323 6 months ago

    00:01 Feature selection techniques are crucial for attribute selection.
    01:35 Feature selection techniques are essential for optimizing machine learning models.
    03:15 Feature dependency and correlation
    04:52 Correlation between attributes and the target variable is important for feature selection
    06:27 Feature selection techniques include recursive feature elimination and genetic algorithm.
    08:03 Feature selection helps in generating multiple models with different feature subsets.
    09:48 Feature selection is important for machine learning model building.
    11:29 Feature selection techniques help in reducing computational expenses and avoiding overfitting.
    Crafted by Merlin AI.

  • @Admin_REX
    @Admin_REX 1 year ago +1

    Arre sir ji, thanks! I will comment after today's paper >>>>>>>>>

  • @devotion_surya3741
    @devotion_surya3741 3 years ago

    Watching this video before the exam; it's very helpful.

  • @Theiteducation2015
    @Theiteducation2015 2 years ago

    Excellent one.. this is the first video of yours I saw.. and it gave me 100% understanding...

  • @shreyashpatil2418
    @shreyashpatil2418 2 months ago +2

    Watching the lecture exactly 13 minutes before the exam😅😅

  • @jeelpanchal6653
    @jeelpanchal6653 1 year ago +1

    Please make a video on feature extraction methods with examples.

  • @codingtrack4237
    @codingtrack4237 3 years ago

    Watching from Pakistan

  • @poojabijlani78
    @poojabijlani78 2 years ago

    Sir, your explanation gives a deep understanding of ML. Thank youuuu

  • @ppuppu1172
    @ppuppu1172 4 years ago

    Your explanation is very easy to understand...

  • @mdriadulhasan5777
    @mdriadulhasan5777 2 years ago +1

    Excellent tutorial

  • @abhijeetranmale8704
    @abhijeetranmale8704 5 years ago +2

    Bhaiji, please cover machine learning by 10th March... even just the numericals will do.

  • @santoshsahu-oy5vo
    @santoshsahu-oy5vo 9 months ago

    Really amazing dear...
    Thanks a lot for your dedication...
    Really it is appreciable!!!

  • @harib879
    @harib879 5 years ago +2

    Bhaiya, I can only give one like, but each of your videos deserves 100 likes. Obviously, I will share them in my groups.

  • @yeshuip
    @yeshuip 4 years ago

    Bhai, you deserve more subscribers.

  • @ninjawarrior_1602
    @ninjawarrior_1602 5 years ago +3

    Please let me know: can we use any of these techniques in an unsupervised clustering problem where there is no target variable?

  • @versatilenick2209
    @versatilenick2209 6 months ago

    Dude, you have made it so interesting, hats off.

  • @MrAyandebnath
    @MrAyandebnath 5 years ago +6

    Very nice explanation.. short and compact.. I love the way you make us understand... I am so happy after watching your video that
    I subscribed to your channel to learn more from you.

  • @hemantkumardas3333
    @hemantkumardas3333 1 year ago

    Very nicely explained, sir...

  • @puspitachatterjee4595
    @puspitachatterjee4595 4 years ago +1

    Bhaiya, please make some videos on GridSearchCV..... confusion matrix. I am your subscriber.

  • @akashprabhakar6353
    @akashprabhakar6353 4 years ago

    Thank you sir....your way of teaching is very lucid ....

  • @aartitatiya8275
    @aartitatiya8275 5 years ago +5

    Sir, please upload videos for units 2-3 of Machine Learning.... the exam is coming, sir, please.

  • @gulfrazahmed684
    @gulfrazahmed684 3 years ago

    A very good way to understand a topic.

  • @muktadhopeshwarkar5687
    @muktadhopeshwarkar5687 3 years ago

    Very nice explanation.. In a very easy manner..

  • @aishwarya8078
    @aishwarya8078 2 years ago

    What an explanation... Hats off

  • @maxpayne880
    @maxpayne880 5 years ago +6

    Sir, just once please do AES and DES encryption.

  • @nikitaprajapati233
    @nikitaprajapati233 4 years ago

    Please explain these topics:
    MATLAB methods
    Neural network toolbox and fuzzy logic toolbox
    Unsupervised learning neural networks
    Simple implementation of artificial neural networks and fuzzy logic

  • @priyankab5527
    @priyankab5527 5 years ago +2

    Please upload video on Data scaling and Normalization.

  • @shwetakaul3758
    @shwetakaul3758 1 year ago

    But a decision tree is a classic example of an overfitting model. So how can you say that embedded methods are better than wrapper methods in terms of overfitting?

  • @rohitthareja1019
    @rohitthareja1019 5 years ago +3

    Awesome explanation!

  • @RishikeshGangaDarshan
    @RishikeshGangaDarshan 4 years ago

    PCA is used for dimensionality reduction, so why do we use other techniques for feature selection? Please clear my doubt.

  • @anilgadekar848
    @anilgadekar848 4 years ago +6

    Great explanation, sir 😃😄,
    I finally found a good resource to learn ML in a simple and easy way 😊

  • @praveenpanikar6415
    @praveenpanikar6415 5 years ago +8

    Sir, thanks a lot for your help.. I have watched, shared and liked every video.. :)
    Please upload more Machine Learning videos...

  • @amirabbas4720
    @amirabbas4720 3 years ago

    Sir, please make videos fully in English so that others who don't know Hindi can also make use of your amazing videos.

    • @5MinutesEngineering
      @5MinutesEngineering  3 years ago

      Yes, now you can find my videos in English as well, only on the 5 Minutes Engineering English YouTube channel. This is a new YouTube channel and I am trying my best to provide Computer Science topics in English, but it may take some time to cover all CS topics.

    • @amirabbas4720
      @amirabbas4720 3 years ago

      @@5MinutesEngineering thank you sir , 👍

    • @Sunilkumar-vf3zp
      @Sunilkumar-vf3zp 2 years ago

      @@5MinutesEngineering please provide the machine learning notes

  • @aniketkumar967
    @aniketkumar967 2 years ago

    Sir, you are an amazing teacher. Hats off to you, sir🧡

  • @Rana2058
    @Rana2058 5 years ago +3

    awesome Dear....

  • @sagarmandal4990
    @sagarmandal4990 9 months ago

    I don't think I have ever encountered a teacher in my life

  • @AMANVERMA-bq8hj
    @AMANVERMA-bq8hj 1 year ago

    Nicely explained. Thanks a lot, sir!

  • @angadkadam2844
    @angadkadam2844 4 years ago

    Sir, nice explanation... But in recursive feature selection, does it also consider reversed orderings? For example, with features A, B, C, D it might try AB, AC, AD... but will it also take BA if it took AB, CA if it took AC, and so on? Or permutations of three, like ABC, ACB, BAC, BCA, CAB, CBA, depending on the number of features? Please answer ASAP.

  • @zahidjaan1319
    @zahidjaan1319 2 years ago

    Thank you, Engr. bhai!

  • @sonu-mb3nh
    @sonu-mb3nh 11 months ago

    Best explanation sir... Great 🎉❤

  • @yashbastawad2331
    @yashbastawad2331 2 years ago

    Superb excellent 👍

  • @ujjalroy1442
    @ujjalroy1442 1 year ago

    Fabulous explanation....

  • @dineshjoshi4100
    @dineshjoshi4100 2 years ago

    Hello, thanks for the explanation. I have one question: does using only the best features help reduce the amount of training data needed? Say I do not have a large dataset, but I can construct independent variables that are highly correlated with the dependent variable; will that let me train on a smaller dataset? Your response will be highly valuable.

  • @RMs414
    @RMs414 1 year ago

    Dear sir, we need a video about feature selection methods using PySpark. Kindly make one.

  • @Zeeshan18382
    @Zeeshan18382 1 year ago

    Very, very nice information for us. Thanks a lot, brother.

  • @ASh-hb1ub
    @ASh-hb1ub 4 years ago

    Very informative lecture. Thank you very much, sir👏👏👏💐💐👌👌👌👌👌👌👌👌👌👌🌹🌹🌹🌹🌹🌹🌹🌹 🌹

  • @TravelDiaries254
    @TravelDiaries254 5 years ago +1

    Please upload the video of isotonic regression

  • @aparnatiwari6442
    @aparnatiwari6442 1 month ago

    Thank you Sir

  • @bhanuprataprai6531
    @bhanuprataprai6531 3 years ago

    Ultimate, bhai. Very nice explanation and method of teaching.

  • @vaghii6731
    @vaghii6731 3 years ago

    Superb teaching

  • @ansarbashashaik4345
    @ansarbashashaik4345 4 years ago

    Thank you, sir, thanks a lot. You helped a lot of people like me. Thank you very much.

  • @veerabehal151
    @veerabehal151 5 years ago +1

    Sir, please share the link to where you taught the chi-square and IG topics you mentioned here.

  • @shakifrauf2204
    @shakifrauf2204 5 years ago

    Sir, fantastic.... Sir, please also cover Python, sir...

  • @nj9493
    @nj9493 4 years ago

    What is the difference between PCA and feature selection?
    Both serve the purpose of selecting useful information or getting rid of the overfitting problem, don't they?

    • @krishnavgarg
      @krishnavgarg 3 years ago

      I think in PCA you don't remove the features; you simply project them into a smaller number of dimensions while maintaining at least 85% of the variance.
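
The point in this reply can be sketched with scikit-learn on the classic iris data (the 85% threshold is just the figure the reply uses): PCA keeps a contribution from every original feature inside fewer derived components, rather than picking a subset.

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X = load_iris().data  # 150 samples, 4 original features

# Ask PCA for the smallest number of components that together
# explain at least 85% of the variance; each component is a
# linear combination of ALL four features, not a subset of them.
pca = PCA(n_components=0.85).fit(X)
X_reduced = pca.transform(X)
print(X_reduced.shape)                      # fewer columns than 4
print(pca.explained_variance_ratio_.sum())  # >= 0.85
```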

  • @college_lyf22
    @college_lyf22 3 years ago

    In which cases should we use filter methods...

  • @WaleedAbdullahkEE
    @WaleedAbdullahkEE 2 years ago

    You explain the concepts best, but please also work through a numerical problem so the concepts can be applied.

  • @tantranathjha3397
    @tantranathjha3397 2 months ago

    very useful

  • @ssdheeraj6347
    @ssdheeraj6347 5 years ago +1

    Wow good explanation

  • @shashankraj6662
    @shashankraj6662 4 years ago

    Is the feature selection problem related to regression...?

  • @lakshayjeetswami489
    @lakshayjeetswami489 4 years ago +1

    Sir, your videos are really good... I get the best explanations of these topics here.. but I want to request more videos... there are a lot more topics in ML which you haven't covered... so please help me there... I am from RTU Kota. My university has arranged this course in an unexpected way... I mean the topics are not sequential at all... please help me...

  • @SandeepSharma-md2eb
    @SandeepSharma-md2eb 4 years ago

    Please arrange the playlist videos in some sequence..

  • @shakifrauf2204
    @shakifrauf2204 5 years ago

    Sir, you are excellent. Sir, please teach machine learning in Python, please please.

  • @umangsaluja8034
    @umangsaluja8034 2 years ago

    Well explained!! Please make some videos for hands-on practice using tools.

  • @sandykumar5350
    @sandykumar5350 5 years ago +1

    Awesome.. Thank you!!!

  • @hassantufik5750
    @hassantufik5750 4 years ago

    Bhai, you are so good.......

  • @susantkumarpal6418
    @susantkumarpal6418 1 year ago

    Sir, kindly produce a video on hypothesis space and inductive bias .

  • @ankeshbobade8240
    @ankeshbobade8240 3 years ago

    well explained sir

  • @kundankaushik3006
    @kundankaushik3006 4 years ago

    Sir, what is hybrid filter-wrapper feature selection? Please make a video on this too.

  • @shrutishelke7752
    @shrutishelke7752 5 years ago +2

    Sir, please upload your videos on the open elective subject Business Intelligence.

  • @TheArchit1
    @TheArchit1 5 years ago +4

    Nice

  • @asifbeigmogal5960
    @asifbeigmogal5960 3 years ago

    Bhai, I don't even have this as a subject... not for exam purposes 😋😋😋... just for gaining knowledge....

  • @nagamalleswarikupparthi4788
    @nagamalleswarikupparthi4788 2 years ago

    Sir, your teaching style is very good... but can you please teach the content in English so that ordinary people can also understand 😊

  • @sujoychatterjee6950
    @sujoychatterjee6950 4 years ago

    Sir, in practice I have 80000 features for each image, and say there are 500 such images, so the dimension is 80000*500. Now I have to apply PCA for dimensionality reduction, but nowhere can I find the technique. Please help me.

  • @richakushwaha1759
    @richakushwaha1759 1 year ago

    Sir, please make videos on NLP; we are in need of it.

  • @learndaily964
    @learndaily964 1 year ago

    Thanks a lot sir❤❤

  • @george4746
    @george4746 4 years ago

    Your videos are fabulous, short and to the point. Can you tell me which book you're following?

  • @anaswahid8520
    @anaswahid8520 5 years ago

    Sir, please make a video on the variable selection methods for multiple regression,
    i.e., the forward, backward and stepwise selection methods in multiple linear regression.
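
The forward/backward selection asked about here can be sketched with scikit-learn's SequentialFeatureSelector (synthetic regression data; all parameters are illustrative):

```python
from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LinearRegression

# 8 candidate features, only 3 actually informative.
X, y = make_regression(n_samples=200, n_features=8,
                       n_informative=3, noise=0.1, random_state=1)

# Forward selection starts from an empty set and greedily adds the
# feature that most improves the cross-validated score; passing
# direction="backward" starts from all features and removes instead.
sfs = SequentialFeatureSelector(LinearRegression(),
                                n_features_to_select=3,
                                direction="forward").fit(X, y)
print(sfs.get_support())  # boolean mask with exactly 3 True
```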

  • @mutiullahjamil
    @mutiullahjamil 4 years ago

    Nice video. If you want to get more details then you can visit CSForum for image processing.

  • @shazabbashir6178
    @shazabbashir6178 4 years ago

    Please upload "PARTIAL LEAST SQUARES (PLS) METHOD - explained with a numerical example"

  • @pankajkumarsingh8337
    @pankajkumarsingh8337 1 year ago

    Sir, please sort your Machine Learning playlist, because having so many videos in different places creates confusion.

  • @user-rx5kq6oo9y
    @user-rx5kq6oo9y 5 years ago

    Sir, but I built a BigMart sales prediction model.
    In this model, if I take only the relevant attributes versus all the attributes, I find that a larger number of attributes gives a more accurate prediction. Please help me out.

  • @muhammadasim551
    @muhammadasim551 2 years ago

    Good job

  • @abhishekgour103
    @abhishekgour103 3 months ago

    Sir, how do we figure out which attribute is the target attribute?