Kernel Density Estimation : Data Science Concepts

  • Published 5 Jan 2025

COMMENTS •

  • @kolepugh9186
    @kolepugh9186 11 months ago +23

    As a senior data science student, I want to enter the job market with as much knowledge as possible. Easy-to-follow videos like this make that goal so much easier. Thank you!

  • @mustafizurrahman5699
    @mustafizurrahman5699 10 months ago +3

    Enthralling video on this topic. I cannot thank you enough for the lucid explanation of this intricate topic.

  • @pipertripp
    @pipertripp 11 months ago +2

    Sublime. This topic just came up in a data analytics course I'm taking (it wasn't a central theme of the lesson, but I hate not knowing the details sometimes) and this programme is a perfect complement to that. Like others have said, your style is intuitive but not over simplified. In general, I feel like you're striking a great balance between ease of understanding and mathematical rigour.

  • @Frijjazzo
    @Frijjazzo 10 months ago +2

    Amazing video, so clear and concise. I learn better with visual and conceptual ideas first before diving into the maths. Thank you!

    • @ritvikmath
      @ritvikmath  10 months ago

      Glad it was helpful!

  • @shu5011
    @shu5011 11 months ago +2

    Love the content. Easy to follow and understand. You are one of the best teachers in the data science field!

  • @jamescanada2460
    @jamescanada2460 2 months ago

    Such a great lesson! Lately I've been very frustrated with the unintuitive and bloated language of my university lectures and texts. Thank you!

  • @malihatunnesa3972
    @malihatunnesa3972 2 days ago

    this man is a magician!

  • @margaritakhachatryan
    @margaritakhachatryan 4 months ago

    10 times better than any materials I had from uni, and now I actually get it!!

  • @uncaged3076
    @uncaged3076 2 months ago

    Thank you for this video. Way way better teaching than what I am getting in university

  • @EricJ-f9m
    @EricJ-f9m 4 months ago

    Crystal clear! Appreciate your effort for making such amazing videos!

  • @hasnaabennis1248
    @hasnaabennis1248 11 months ago +1

    Amazing video! Clearly explained with an easy-to-understand example. Thank you

  • @iffatara8846
    @iffatara8846 7 months ago

    the only video I understood without mathematical jargon.

  • @andrashorvath2411
    @andrashorvath2411 9 months ago

    Very clear flow of explanation, thank you. I'm thinking it would be useful to design a hypothesis test for the chosen setup to back up the final density, so that along with the value at the chosen point we'd also get a sense of how much support the number and positions of the known fixed points give the final result. More research would be nice.

  • @HemanthKumar-vl9oh
    @HemanthKumar-vl9oh 11 months ago +3

    Very good and intuitive explanation

  • @petegranneman1623
    @petegranneman1623 7 months ago

    Great explanation! Gaussian KDE is great for bimodal and skewed distributions. One downside with Gaussian KDE is difficulty accurately modeling distributions with high excess kurtosis.
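
    [Editor's note] As a quick check of the bimodal claim above, here is a minimal numpy sketch (not from the video; the mixture, bandwidth, and random seed are all assumed for illustration) showing a plain Gaussian KDE recovering both modes of a bimodal sample:

```python
import numpy as np

rng = np.random.default_rng(2)
# Assumed bimodal sample: equal mixture of N(-2, 0.5^2) and N(2, 0.5^2).
data = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(2, 0.5, 200)])

def kde(x, data, h):
    # Gaussian KDE: average of normal bumps of width h, one per data point.
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

grid = np.linspace(-5.0, 5.0, 2001)
y = kde(grid, data, h=0.3)

# Local maxima of the estimate: grid points higher than both neighbours.
modes = grid[1:-1][(y[1:-1] > y[:-2]) & (y[1:-1] > y[2:])]
print(np.round(modes, 1))  # mode locations, near -2 and +2
```

    A histogram-free estimate like this picks up both humps without being told how many there are, which is the appeal for bimodal/skewed data.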

  • @jamesagresto4049
    @jamesagresto4049 2 months ago

    Big fan of the fish drawings :)

  • @ЮхновськийНазарій
    @ЮхновськийНазарій a month ago

    thank you very much, it has become so clear.

  • @faustovrz
    @faustovrz 10 months ago

    Clear explanation and easy to follow, thank you! Silly observation: "Integrate over all possible weights of fish. All the way from negative infinity to positive infinity": I'm no ichthyologist or fisherman but I feel negative weight fish ain't an option.

    • @pranavchandrav6071
      @pranavchandrav6071 7 months ago

      Negative infinity to positive infinity just means that you have to integrate the PDF over its domain :)
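
    [Editor's note] The reply has it right: the infinite limits are just the domain of the density. A small numpy sketch (the fish weights and bandwidth below are made up) confirms that the estimate integrates to 1 over the whole real line and puts essentially no mass on negative weights:

```python
import numpy as np

# Hypothetical fish weights in kg (made up); all positive, as the comment notes.
data = np.array([0.8, 1.1, 1.2, 1.5, 2.0])
h = 0.2  # assumed bandwidth

def kde(x, data, h):
    # Gaussian KDE: average of N(x_i, h^2) density bumps.
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

# A wide, fine grid stands in for the (-inf, inf) integration range.
grid = np.linspace(-10.0, 10.0, 20001)
dx = grid[1] - grid[0]
y = kde(grid, data, h)

total = y.sum() * dx                      # integral over the whole real line
mass_below_zero = y[grid < 0].sum() * dx  # mass at impossible negative weights
print(round(total, 3))   # ~1.0: the estimate is a proper density
print(mass_below_zero)   # tiny: the Gaussian tail below 0 is negligible here
```

    In practice the tail mass below zero is harmless when the bandwidth is small relative to the data; boundary-corrected kernels exist for when it is not.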

  • @perkyfever
    @perkyfever 11 months ago

    Quality content here. Also examples are nice and clear!

  • @mandyguo4020
    @mandyguo4020 4 months ago

    Always the best!!

  • @franfurey
    @franfurey 10 months ago

    Love it, amazing work in this video, congrats!

  • @iaaan1245
    @iaaan1245 2 months ago

    Banger video as usual

  • @isoljator
    @isoljator 3 months ago

    Excellent video, subscribed!

  • @dr_greg_mouse4125
    @dr_greg_mouse4125 8 months ago +1

    Really nice explanation. Thanks a lot.

  • @edgarromeroherrera2886
    @edgarromeroherrera2886 a month ago

    so useful. Thank you so much man

  • @niklasbjorkenheim1479
    @niklasbjorkenheim1479 9 months ago +1

    Thank you, great video :)

  • @faisalhussain1045
    @faisalhussain1045 3 months ago

    Just one silly question, please: which tool did you use to plot the graphs at 15:20?

  • @FlemingRound
    @FlemingRound 4 months ago

    Very nice!

  • @VarunMalik-mo6mr
    @VarunMalik-mo6mr 3 months ago

    You are the best ❤

  • @johnhausmann2391
    @johnhausmann2391 2 days ago

    You say that Kh is normal centered at Xi, but the way you've set it up, it looks like Kh will be centered at 0 (i.e., when x = xi, density is max).
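
    [Editor's note] Both readings are consistent: the kernel function itself peaks at 0, but the term it contributes to the sum is K_h(x - x_i), which, viewed as a function of x, peaks at x = x_i. A tiny numpy check (Gaussian kernel and numbers assumed for illustration):

```python
import numpy as np

def K_h(u, h=0.5):
    # Gaussian kernel of bandwidth h; as a function of u it is centered at 0.
    return np.exp(-0.5 * (u / h)**2) / (h * np.sqrt(2 * np.pi))

xi = 1.7  # one hypothetical data point
x = np.linspace(-3.0, 5.0, 8001)

# The bump this point contributes, K_h(x - xi), peaks exactly at x = xi.
peak_location = x[np.argmax(K_h(x - xi))]
print(round(peak_location, 3))  # 1.7
```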

  • @emre-erdin
    @emre-erdin 4 months ago

    Thank you for this amazing video! But I have a question. At the beginning, the question was framed as "What is the population density?" But doesn't KDE give us the density at a specific data point rather than for the whole population? Because the result is computed at a query point that does not itself appear in the data, it seems we are actually estimating the density at a specific point instead of for the population. Did I get it wrong, or was the question generalized?

  • @luciapalacios7819
    @luciapalacios7819 11 months ago

    Amazing video thanks!!!!

  • @orastem
    @orastem 3 months ago

    Would it be fair to say that this method is applicable mostly when the amount of data is relatively low? With a large amount of data you'd just plot a histogram and be done? What sort of data do you visualise with KDE?

  • @eramy1
    @eramy1 10 months ago

    Thanks for the good explanation of the KDE method. Could you please make a video about prediction intervals (PI), which sometimes use the KDE method?
    Thanks!

  • @njabulonzimande2893
    @njabulonzimande2893 3 months ago

    Part of non-parametric regression for postgraduate statistics

  • @winstongraves8321
    @winstongraves8321 11 months ago

    Great video

  • @nilkantgudpale1959
    @nilkantgudpale1959 10 months ago

    loved the way you teach

  • @ovren4897
    @ovren4897 7 months ago

    great video, but I am confused about why we didn't just use 1/n * (sigma(...)) for the MISE formula instead of an integral and an expected value.

    • @deltamico
      @deltamico 7 months ago

      You integrate because you're working with continuous functions. It is already normalized, since the squared difference could be at most 1. We also want a good estimated distribution to perform well on other samples from the true distribution. That's why we take the expected error over various samples.
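
    [Editor's note] To make the reply concrete, here is a rough Monte Carlo sketch of MISE (the true distribution, sample size, bandwidths, and repetition count are all assumptions for illustration): the squared error is integrated over x because the estimand is a whole function, and the expectation averages that integrated error over repeated samples:

```python
import numpy as np

rng = np.random.default_rng(0)

def kde(x, data, h):
    # Gaussian KDE evaluated on a grid x for one sample `data`.
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def true_pdf(x):
    # Assumed true density: standard normal.
    return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

grid = np.linspace(-5.0, 5.0, 1001)
dx = grid[1] - grid[0]

def ise(data, h):
    # Integrated squared error for ONE sample: an integral over x,
    # not a 1/n sum over the data points.
    return ((kde(grid, data, h) - true_pdf(grid)) ** 2).sum() * dx

def mise(h, n=50, reps=200):
    # MISE = E[ISE] over repeated samples; approximate the expectation
    # by averaging over many draws from the true distribution.
    return float(np.mean([ise(rng.standard_normal(n), h) for _ in range(reps)]))

errs = {h: mise(h) for h in (0.05, 0.4, 2.0)}  # under-, roughly right, over-smoothed
print(errs)
```

    The moderate bandwidth comes out with the lowest error, which is exactly the bias–variance trade-off that MISE is designed to score.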

  • @Baharehhashemi-df4cv
    @Baharehhashemi-df4cv 8 months ago

    thank you

  • @alihussien7935
    @alihussien7935 11 months ago +1

    Wow, you are great! Can you make full videos about ML using the book An Introduction to Statistical Learning with Applications in R?

    • @ritvikmath
      @ritvikmath  11 months ago +1

      Thanks! I’ll look into it

    • @alihussien7935
      @alihussien7935 11 months ago

      @@ritvikmath Please do it! You explain things easily and simply, giving just the essential information, so it's very easy for us to remember.

  • @_noirja
    @_noirja 11 months ago +1

    very very good one pound fish

    • @mario1ua
      @mario1ua 11 months ago

      Come on ladies, come on ladies

  • @vallaugeri3152
    @vallaugeri3152 6 months ago

    So helpful, better than my professor lol

  • @ImTheCitizenInsane
    @ImTheCitizenInsane 4 months ago

    Great content, and very clearly explained. May I just suggest starting from a "white sheet", or almost? It doesn't need to be written or drawn incredibly well, but the full sheets feel pretty overwhelming.

  • @louvasi7388
    @louvasi7388 3 months ago

    It would have been nice if you had written down the math at 11:50.

  • @AdrianBoyko
    @AdrianBoyko a month ago

    Was expecting the terms to be “over fitted” and “under fitted” but they turned out to be “under smoothed” and “over smoothed”. So disappointed.
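
    [Editor's note] The naming gripe aside, over/under fitting and over/under smoothing describe the same trade-off here, just indexed by bandwidth. A small numpy sketch (the sample and bandwidths below are assumed) counts the modes of the estimate: a tiny h memorises individual points and produces many spurious bumps, while a large h flattens everything into one:

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.standard_normal(30)  # assumed sample from one smooth unimodal density

def kde(x, data, h):
    # Gaussian KDE: average of normal bumps of width h.
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

grid = np.linspace(-4.0, 4.0, 4001)

def n_modes(h):
    # Count local maxima of the estimate (interior points above both neighbours).
    y = kde(grid, data, h)
    return int(np.sum((y[1:-1] > y[:-2]) & (y[1:-1] > y[2:])))

# Under-smoothed: roughly one bump per point. Over-smoothed: everything merges.
print(n_modes(0.05), n_modes(1.0))
```

    Whichever vocabulary you prefer, a too-small bandwidth is the high-variance ("over-fitted") regime and a too-large one is the high-bias ("under-fitted") regime.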