13.4.3 Feature Permutation Importance Code Examples (L13: Feature Selection)

  • Published Nov 7, 2024

COMMENTS • 13

  • @lucarubinetti2523
    @lucarubinetti2523 1 year ago +5

    Can you do some video about Shapley values for feature importance? Thanks a lot :)
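    For readers curious about this question: a feature's Shapley value is its marginal contribution averaged over all feature orderings. A from-scratch sketch on a hypothetical two-feature coalition "value" function (illustrative only, not code from the video):

    ```python
    from itertools import permutations

    # Toy "value" of a coalition of features, e.g. a model score
    # when trained on that subset (all numbers are hypothetical)
    def value(coalition):
        scores = {frozenset(): 0.0,
                  frozenset({"a"}): 0.6,
                  frozenset({"b"}): 0.3,
                  frozenset({"a", "b"}): 0.9}
        return scores[frozenset(coalition)]

    def shapley(feature, features):
        # Average the marginal contribution of `feature` over all orderings
        total = 0.0
        perms = list(permutations(features))
        for order in perms:
            idx = order.index(feature)
            before = set(order[:idx])
            total += value(before | {feature}) - value(before)
        return total / len(perms)

    print(shapley("a", ["a", "b"]))
    print(shapley("b", ["a", "b"]))
    ```

    The per-feature values sum to the full coalition's value (the "efficiency" property), which is one reason Shapley values are popular for attributing model performance to features.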

  • @liss9597
    @liss9597 1 year ago

    Very thankful for this video and the entire set of videos. At minute 7:08, X_test and y_test must be NumPy arrays, right? If so, should I use X_test.values and y_test.values, or X_test.to_numpy() and y_test.to_numpy()? Thanks again!
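    A minimal sketch of the conversion question (the data and model here are hypothetical stand-ins, not the video's code). Both accessors return the same array, with `.to_numpy()` being the recommended modern API; note that scikit-learn's `permutation_importance` also accepts DataFrames directly:

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance

    # Toy data standing in for X_test / y_test
    rng = np.random.default_rng(0)
    X = pd.DataFrame({"f1": rng.random(100), "f2": rng.random(100)})
    y = (X["f1"] > 0.5).astype(int)

    model = RandomForestClassifier(random_state=0).fit(X, y)

    # .to_numpy() and .values yield the same underlying array here
    assert np.array_equal(X.to_numpy(), X.values)

    # A DataFrame can also be passed without any conversion
    result = permutation_importance(model, X, y, n_repeats=5, random_state=0)
    print(result.importances_mean)
    ```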

  • @khaledsrrr
    @khaledsrrr 1 year ago

    Keep them coming ❤❤❤
    I liked it

  • @sahanaseetharaman1440
    @sahanaseetharaman1440 2 years ago

    Thanks for the video! In this case, two of the features are perfectly correlated. What if the correlation is less than 1 in absolute value? Also, what happens in the case of categorical features? Suppose there is a feature column with multiple categorical levels, and we one-hot encode it; does it make sense to sum the resulting feature importances to get the importance of that original feature?
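    One way to probe the one-hot question is to sum the per-column permutation importances of the dummy columns belonging to a single categorical feature. A sketch on assumed toy data (summing is a common heuristic; jointly permuting the whole dummy group at once is the more principled alternative):

    ```python
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.inspection import permutation_importance

    # Toy data: the label depends only on the categorical "color" column
    rng = np.random.default_rng(0)
    color = rng.choice(["red", "green", "blue"], size=200)
    X = pd.get_dummies(pd.DataFrame({"color": color,
                                     "num": rng.normal(size=200)}))
    y = (color == "red").astype(int)

    model = RandomForestClassifier(random_state=0).fit(X, y)
    res = permutation_importance(model, X, y, n_repeats=10, random_state=0)

    # Sum the importances of the one-hot columns derived from "color"
    onehot_idx = [i for i, c in enumerate(X.columns) if c.startswith("color_")]
    color_importance = res.importances_mean[onehot_idx].sum()
    print(color_importance)
    ```

    Since the label is driven entirely by "color", the summed importance of its dummy columns dominates the importance of the irrelevant numeric feature.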

  • @bezawitdagne5756
    @bezawitdagne5756 1 year ago +1

    Thank you so very much 💙🙏

  • @andrewm4894
    @andrewm4894 8 months ago +1

    Great video, really useful explanations

  • @monashaaban2337
    @monashaaban2337 1 year ago +1

    Hi Sebastian Raschka, can you explain LDA with code please?

    • @SebastianRaschka
      @SebastianRaschka  1 year ago +1

      Coincidentally, I wrote about it here a few years back: sebastianraschka.com/Articles/2014_python_lda.html

    • @monashaaban2337
      @monashaaban2337 1 year ago

      @@SebastianRaschka thank you.
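    For readers who want a quick runnable counterpart to the linked article, here is a minimal scikit-learn LDA sketch (the iris dataset and the 2-component projection are illustrative choices, not from the article or video):

    ```python
    from sklearn.datasets import load_iris
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    X, y = load_iris(return_X_y=True)

    # LDA can project onto at most (n_classes - 1) discriminant axes;
    # iris has 3 classes, so 2 components is the maximum here
    lda = LinearDiscriminantAnalysis(n_components=2)
    X_lda = lda.fit_transform(X, y)
    print(X_lda.shape)
    ```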

  • @ocamlmail
    @ocamlmail 2 years ago

    Very useful, thank you!

  • @mariajuanasadventures3672
    @mariajuanasadventures3672 2 years ago +1

    Thanks for the video