Forward Stepwise Selection Method

  • Published 6 Oct 2024

COMMENTS • 22

  • @LexieLin-fd5xp • 1 year ago

    Nice explanation, I finally fully understand the forward and backward methods.

  • @gulinasirova3257 • 1 year ago

    Very good explanation!

  • @saikishore1992 • 5 years ago +1

    Really cool explanation. Please keep producing more videos. If I had some investment money, I would have put it right into your channel just to make it fancier; the content is superb!

    • @IQmates • 5 years ago +1

      Sai Kishore Subramaniam, thank you for the positive feedback. I am back to recording now. I'm about to finish the machine learning course, and then I'll start on deep learning 😊

    • @Viewfrommassada • 4 years ago

      I feel the same too. Great work

  • @lizzy1138 • 3 years ago +1

    Great video, but you should credit the book you took the method almost verbatim from: Introduction to Statistical Learning, chapter 6, algorithm 6.2. But again, really helpful, thank you!!! EDIT: IQmates actually credited several sources in an introduction video, my mistake.

    • @IQmates • 3 years ago

      Hi Dr. Lizard. In my introduction video I do credit the sources I use. Most of the structure came from ISLR, and I mention sources such as the Towards Data Science blog, Analytics Vidhya, ELSR and so on. Thank you for the feedback though. I will edit the caption.

    • @lizzy1138 • 3 years ago

      @@IQmates My apologies, I had not seen the introduction video; I will edit my comment. Thank you for your reply.

  • @erickkaviga6001 • 3 years ago

    Most helpful thing in my studies 🙏🏿🙏🏿

  • @vikasbhardwaj455 • 5 years ago +1

    Great Explanation.

    • @IQmates • 5 years ago

      Thanks Vikas Bhardwaj. I’m glad it made sense. Stay tuned for more videos!

  • @wahyuwisnuwardana9839 • 5 years ago +1

    Good explanation! But just to give a suggestion: why don't you use fewer than 10 features? Let's say 3 or 4 is enough. :)

    • @IQmates • 5 years ago

      Wahyu Wisnu Wardana, thank you for the feedback. I will try to use fewer features in the explanations. My aim was to show how the method becomes computationally difficult as we have more features 😊

  • @MichaelMeighu • 4 years ago

    This is an amazing site! Thank you, pal. If you have a Patreon account you should post it. The other thing: are you sure it's B1X2, or is it B2X2?

  • @sagaradoshi • 1 year ago

    Thanks for the wonderful explanation. I have a doubt; I hope you get some time to clarify. For example, when we use RSS or R^2 to find the best model, what values should we substitute for the parameters (betas)? Normally we find the parameter values that best fit our data and keep adjusting until the RSS is reduced. But in the above case, where in each loop we check many models (i.e., by adding an additional variable), how is our beta calculated?

    • @yusufansari9159 • 10 months ago

      First we fit a model; this fit gives the estimates of the beta parameters. Then we check the RSS (or R^2). The model whose RSS is lowest among the candidates is selected, and its beta estimates are the ones kept for that model.
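      The reply above can be sketched in a few lines of Python (an illustrative sketch with made-up data, not code from the video): for each candidate one-variable model, ordinary least squares produces the beta estimates, and the candidate with the lowest RSS is kept.

```python
import numpy as np

# Made-up data: 4 candidate predictors, true signal in column 1.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
y = 2.0 * X[:, 1] + rng.normal(size=50)

best_rss, best_j = np.inf, None
for j in range(X.shape[1]):
    # Fit intercept + one predictor by least squares;
    # this fit is what produces the beta estimates.
    Xj = np.column_stack([np.ones(50), X[:, j]])
    beta, *_ = np.linalg.lstsq(Xj, y, rcond=None)
    r = np.sum((y - Xj @ beta) ** 2)  # RSS of this candidate
    if r < best_rss:
        best_rss, best_j = r, j

print(best_j)  # index of the predictor whose fitted model has the lowest RSS
```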

  • @morikawadivinda2586 • 5 years ago

    It's a good explanation, but I'm still confused. When does the process stop?

    • @IQmates • 5 years ago

      The principle is that there is no defined stopping criterion for this. You fit the candidate models at every step and compare them to get the best one. For example, when you are doing k = 2, you create all models that add one more variable to the best one-variable model and pick the best of those; then you do the same for k = 3, and so on, getting the best model for each k. After you have the best model for each k, you compare those best models using AIC or a similar criterion to get the best of the best. So there is no stopping criterion: you go all the way up to the full model.

    • @Viewfrommassada • 4 years ago +1

      I think it stops when adding any further variable doesn't improve the model, or when the additional variables are insignificant.
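      The procedure described in the replies above can be sketched in Python (an illustrative sketch, not code from the video; it assumes plain least-squares fits and RSS scoring, and the function and variable names are invented):

```python
import numpy as np

def rss(X, y):
    """RSS of the least-squares fit of y on the columns of X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

def forward_stepwise(X, y):
    """Return the best feature set for each model size k = 1..p."""
    n, p = X.shape
    selected, remaining = [], list(range(p))
    best_per_k = []
    while remaining:
        # Try adding each remaining feature to the current model;
        # keep the addition that gives the lowest RSS.
        scores = []
        for j in remaining:
            cols = selected + [j]
            Xk = np.column_stack([np.ones(n)] + [X[:, c] for c in cols])
            scores.append((rss(Xk, y), j))
        _, best_j = min(scores)
        selected.append(best_j)
        remaining.remove(best_j)
        best_per_k.append(list(selected))
    # The best models per k would then be compared with AIC/BIC/cross-validation;
    # the loop itself runs all the way to the full model, with no early stop.
    return best_per_k

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 5))
y = 3 * X[:, 0] - 2 * X[:, 3] + rng.normal(size=60)
models = forward_stepwise(X, y)
print(models[:2])  # feature sets chosen at k = 1 and k = 2
```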

  • @muhammadfarhan-qe6gp • 5 years ago

    Hi, I'm Farhan. I can't understand your main step when you build the 3rd model.

    • @IQmates • 5 years ago

      Which part exactly, muhammad farhan? Can you give the timestamp of the part you're asking about?

  • @muhammadfarhan-qe6gp • 5 years ago

    Please explain it.