Approximate Nearest Neighbors: Data Science Concepts

  • Published Aug 28, 2024

COMMENTS • 59

  • @ScaredCrows4
    @ScaredCrows4 3 years ago +18

    Wow, you have no idea how much I needed this for my current work project. Thanks as always for a fantastic explanation.

  • @vineethgudela2033
    @vineethgudela2033 9 months ago +1

    I have implemented ANN on my own after watching your video. Thanks for the great explanation, Ritvik!

  • @csbanki
    @csbanki 2 years ago +2

    This is perfect!
    I'm so sick of all this fancy, literature-style stuff from professors all over the world who can only communicate through differential equations. THIS is how it should be explained. Thank you, good sir!

  • @jiayangcheng
    @jiayangcheng 3 months ago

    Thank you so much, sir; this explanation shows your exceptional ability to teach. So enlightening!

  • @Septumsempra8818
    @Septumsempra8818 3 years ago +3

    Both formats are cool

  • @zivleibowitz9846
    @zivleibowitz9846 1 year ago

    OMG. I hope all my lecturers will explain things that clearly and intuitively. Thanks!

  • @aZnPriDe707
    @aZnPriDe707 2 years ago +2

    Clear explanation and very resourceful!

  • @hannahnelson4569
    @hannahnelson4569 2 months ago

    This is brilliant! Thank you so much for showing us this method!

  • @ctRonIsaac
    @ctRonIsaac 2 years ago

    The lesson was clear, and paper can be easier for you to control and work with, so this is fine. Thank you for the lesson!

  • @rockapedra1130
    @rockapedra1130 1 year ago

    You have a very clear but not too wordy style. *SUBSCRIBED*

  • @PhilipMavrepis
    @PhilipMavrepis 3 years ago +6

    Pretty good explanation, but you never showed what happens if the number of neighbors K you are searching for is bigger than the number of points in the specific area.
    For example, let's say you have a new point in region R4, which has 3 points, and you are searching for the 4-NN of that point.
    Thank you again for this video, really liked it.

    • @Han-ve8uh
      @Han-ve8uh 1 year ago

      Doesn't answer your question directly, but in the FAISS IVF index, if k is more than the number of items in a cell, it returns an id of -1 for the extra required neighbors; the solution is to increase the default nprobe=1 to probe more cells.
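
      For anyone who wants to see that behavior concretely, here is a minimal FAISS sketch (assuming the faiss-cpu package and random data; the dimensionality and cell count are arbitrary choices, not from the video):

        import numpy as np
        import faiss

        d = 16                                           # vector dimensionality
        xb = np.random.rand(2000, d).astype("float32")   # database vectors
        xq = np.random.rand(1, d).astype("float32")      # one query vector

        nlist = 100                               # number of IVF cells (Voronoi regions)
        quantizer = faiss.IndexFlatL2(d)          # coarse quantizer holding the cell centroids
        index = faiss.IndexIVFFlat(quantizer, d, nlist)
        index.train(xb)
        index.add(xb)

        # The default nprobe=1 scans a single cell; if that cell holds fewer
        # than k vectors, the missing neighbor ids come back as -1.
        D, I = index.search(xq, 50)
        print(I)                                  # may contain -1 entries

        index.nprobe = 10                         # probe more cells to find enough candidates
        D, I = index.search(xq, 50)
        print(I)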

  • @randall.chamberlain
    @randall.chamberlain 1 year ago

    Mate you really know how to explain things. Thanks for your time and dedication.

  • @seyedalirezaabbasi
    @seyedalirezaabbasi 9 months ago

    This format is better. Thanks.

  • @siddharthvij9087
    @siddharthvij9087 1 month ago

    Excellent Video

  • @Shkedias
    @Shkedias 4 days ago

    Thank you😊

  • @Mci146
    @Mci146 1 year ago

    Thank you so much for the simple and clear explanation with examples!

  • @vinnythep00h
    @vinnythep00h 2 months ago

    Great explanation!

  • @Crimau12000
    @Crimau12000 1 year ago

    Thanks for sharing such a detailed and thorough explanation!

  • @monalover3758
    @monalover3758 9 months ago

    Very clear explanation! I think I got it in one pass! Pace is good. Thanks! (PS. the paper format is fine!)

  • @doronbenchayim8526
    @doronbenchayim8526 2 years ago +2

    THIS WAS AMAZING!!!!!!!!!!!!!!!

  • @marcosricardooliveira3790
    @marcosricardooliveira3790 1 year ago

    I really like this format for this kind of explanation,
    like explaining how a technique works.
    Very good vid, thanks!

  • @RetropunkAI
    @RetropunkAI 1 year ago

    best explanation ever. thank you

  • @jamemamjame
    @jamemamjame 1 year ago

    thank you very much 🙏🏼

  • @mayapankhaj9124
    @mayapankhaj9124 5 months ago

    Thank you so much for a beautiful lesson. Reminded me of my elementary school days and how teachers used to teach back then.

  • @KeithGalli
    @KeithGalli 2 years ago +1

    Thanks! Good vid :)

  • @oddtraveller
    @oddtraveller 2 years ago

    Greatly explained

  • @alinajafi1528
    @alinajafi1528 2 years ago

    Perfect explanation! Thanks :D

  • @nadiakacem24
    @nadiakacem24 1 year ago

    thank you very much, it was so helpful

  • @dinuthomas4531
    @dinuthomas4531 1 year ago

    Very well explained!!

  • @brockobama257
    @brockobama257 6 months ago

    3:56 I thought that a k-d tree can search for a nearest neighbor in log n and delete or add a point in log n, so k nearest neighbors could be considered k log n, which is less than n.

  • @zahrashekarchi6139
    @zahrashekarchi6139 1 year ago

    well explained! thanks!

  • @raunaquepatra3966
    @raunaquepatra3966 3 years ago +9

    "The lowest complexity for KNN is O(n)" is not true!
    Using a k-d tree, the complexity becomes O(log n).
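
    For reference, exact k-d tree queries are easy to try with scikit-learn's KDTree (a sketch, not from the video). Building the tree costs roughly O(n log n), and each query is about O(log n) in low dimensions, though this degrades toward O(n) as dimensionality grows, which is part of the motivation for approximate methods:

      import numpy as np
      from sklearn.neighbors import KDTree

      rng = np.random.default_rng(0)
      X = rng.random((10_000, 2))            # 10k points in 2-D
      tree = KDTree(X, leaf_size=40)         # built once, reused for many queries

      query = rng.random((1, 2))
      dist, ind = tree.query(query, k=5)     # exact 5 nearest neighbors of the query
      print(ind, dist)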

  • @haneulkim4902
    @haneulkim4902 1 year ago

    Thanks for a great video! One question: at 9:23, for the new point we check whether the given point is below or above the blue line. The way you recognize whether the point is above or below is by calculating the distance between (point, 1) and (point, 9)?

  • @loupax
    @loupax 1 year ago

    I now wonder if this is a sensible algorithm for collision detection

  • @sarmale-cu-mamaliga
    @sarmale-cu-mamaliga 2 years ago

    Really cool :O thank you

  • @RishiRajKoul
    @RishiRajKoul 9 months ago

    How would we determine whether a point is above or below a line using code?
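
    One common way to do it (a sketch, not the video's code): for a split line halfway between two points p1 and p2, take the normal n = p2 - p1 and check the sign of the dot product of (query - midpoint) with n. A positive sign means the query falls on p2's side, a negative sign means p1's side, which is the same as asking which of the two points is closer:

      import numpy as np

      def side_of_split(query, p1, p2):
          normal = p2 - p1                   # direction perpendicular to the split line
          midpoint = (p1 + p2) / 2.0         # the split line passes through this point
          return np.sign(np.dot(query - midpoint, normal))

      p1 = np.array([1.0, 1.0])
      p2 = np.array([4.0, 3.0])
      q = np.array([2.0, 5.0])
      print(side_of_split(q, p1, p2))        # +1.0 -> the query lies on p2's side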

  • @borknagarpopinga4089
    @borknagarpopinga4089 3 years ago

    What's your qualification? Somehow I cannot find any information about your education etc. Awesome videos by the way, a lot easier to understand than what every professor tries to explain.

  • @kanchansarkar7706
    @kanchansarkar7706 2 years ago

    Such a great explanation! I wonder, do you also have a similar video for HNSW? Thanks!

  • @stevengusenius7333
    @stevengusenius7333 2 years ago

    Great video as always, Ritvik.
    Am I correct that building the tree is an O(N) operation? That is, if I have only one new data point and haven't yet constructed the tree, will this still save any time over the exhaustive method?
    If not, then I presume building a forest would imply some break-even point.
    Thanks.

  • @qwerty22488
    @qwerty22488 2 years ago

    Thanks for this excellent video! Is there a popular library that helps to experiment with ANN on a local machine for a small set of data?
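
    One popular option is Spotify's annoy package (pip install annoy), which builds a forest of random-projection trees similar in spirit to the splitting idea described here. A minimal sketch with made-up data:

      import random
      from annoy import AnnoyIndex

      f = 10                                    # vector dimensionality
      index = AnnoyIndex(f, 'euclidean')
      for i in range(1000):
          index.add_item(i, [random.gauss(0, 1) for _ in range(f)])
      index.build(10)                           # 10 trees; more trees -> better recall, more memory

      query = [random.gauss(0, 1) for _ in range(f)]
      print(index.get_nns_by_vector(query, 5))  # ids of the 5 approximate nearest neighbors

    scikit-learn's NearestNeighbors, FAISS, and hnswlib are other easy choices for small local experiments.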

  • @ScottSummerill
    @ScottSummerill 3 years ago

    I like it MUCH better. I found it sometimes overwhelming to be confronted with all the info and not yet have an explanation.

  • @rockapedra1130
    @rockapedra1130 1 year ago

    Paper is better, I think. Moving the papers around is like zooming without moving the camera.

  • @JayRodge
    @JayRodge 1 year ago

    I still don't understand how you classify the new point. Region-wise, or is there any other method?

    • @monalover3758
      @monalover3758 9 months ago

      Here is what I think: each region has two points, so use a metric (e.g. distance) from the given new point to each of the two points and go with the closer one. The closeness can be Euclidean distance, cosine distance, or some other metric.
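
      A tiny sketch of that idea using Euclidean distance (the anchor points here are hypothetical, just for illustration):

        import numpy as np

        def closer_anchor(new_point, anchor_a, anchor_b):
            da = np.linalg.norm(new_point - anchor_a)   # distance to the first anchor
            db = np.linalg.norm(new_point - anchor_b)   # distance to the second anchor
            return "a" if da <= db else "b"

        print(closer_anchor(np.array([2.0, 5.0]),
                            np.array([1.0, 1.0]),
                            np.array([4.0, 3.0])))      # -> "b" (the second point is closer)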

  • @djangoworldwide7925
    @djangoworldwide7925 1 year ago

    Such a nice recursive challenge. Anyone have an idea how to define a function to recursively solve this kind of algorithm, given a criterion for the maximum number of points?
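
    One way to write that recursion (a sketch of one possible implementation, not the video's exact code): pick two random anchor points, split the data by which anchor each point is closer to, and recurse on each half until a node holds at most max_points:

      import numpy as np

      def build_tree(points, max_points=3, rng=None):
          if rng is None:
              rng = np.random.default_rng()
          if len(points) <= max_points:
              return {"leaf": points}                    # region is small enough; stop splitting
          i, j = rng.choice(len(points), size=2, replace=False)
          a, b = points[i], points[j]                    # two random anchor points define the split
          dist_a = np.linalg.norm(points - a, axis=1)
          dist_b = np.linalg.norm(points - b, axis=1)
          mask = dist_a <= dist_b                        # True -> closer to anchor a
          return {
              "anchors": (a, b),
              "left": build_tree(points[mask], max_points, rng),
              "right": build_tree(points[~mask], max_points, rng),
          }

      pts = np.random.default_rng(0).random((50, 2))     # 50 random 2-D points
      tree = build_tree(pts, max_points=5)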

  • @atwork22
    @atwork22 5 months ago

    Looks like a sort of binary search.

  • @X_platform
    @X_platform 3 years ago

    Is ANNOY using Voronoi?