Hierarchical Agglomerative Clustering [HAC - Single Link]

  • Published 3 Oct 2024
  • Data Warehouse and Mining
    For more: www.anuradhabha...

COMMENTS • 252

  • @shirleywaller7753
    @shirleywaller7753 3 years ago +31

    THANK YOU SOOOOO MUCH!!!!! After emailing my professor and getting no real help, and spending days trying to figure it out on my own, you saved me! You were very clear, explained everything very well, and I was now able to complete my assignment. I can't thank you enough for this invaluable help! I wish professors and teachers were all this good!

    • @durrotunnashihin5480
      @durrotunnashihin5480 3 years ago +2

      Me too. The video saved me in the last minutes before my exam, thank you so much.

  • @tusharkumar3828
    @tusharkumar3828 5 years ago +75

    Exactly! This is what I was looking for: the manual calculation and formation of the dendrogram. Thank you ma'am.

    • @sabertooth21
      @sabertooth21 2 years ago

      You should watch Mahesh Huddar's video for a better explanation, buddy.

  • @durrotunnashihin5480
    @durrotunnashihin5480 3 years ago +15

    Three hours before my exam I watched this video, and fortunately single- and complete-linkage questions appeared in the last part of my machine learning exam. Thank you very much, this video gave me a better understanding :D

    • @tushar3549
      @tushar3549 2 years ago

      Thank you on behalf of her

  • @adrianacorrea5634
    @adrianacorrea5634 5 years ago +9

    Thank you so much! You are a star! I have just started an MSc after graduating 20 years ago. I am completely lost and you have been a great help.

  • @IlliaDubrovin
    @IlliaDubrovin 1 year ago

    The knowledge you are sharing is so rare to find!! You save the world!!!

  • @PokemonClassicMaster
    @PokemonClassicMaster 2 years ago +1

    Simple and to the point. The best explanation on youtube.

  • @danielgetty7216
    @danielgetty7216 5 months ago

    I've been trying to figure this out for a couple of days for class. Once you showed the MIN(dist) it all clicked. Thank you!

  • @waleedbinowais7624
    @waleedbinowais7624 3 years ago +1

    The best explanation of agglomerative clustering example on the internet. Thank you so much!

  • @sam9620
    @sam9620 3 years ago +3

    Thank you ma'am, the best thing about your channel is that you don't miss any step during the calculations, which makes it easy to understand 💯

  • @biaoalex2018
    @biaoalex2018 1 year ago +1

    Thank you SOOOOOOOOO much!! You explained the solution MUCH BETTER than my tutors did, this is so helpful!

  • @adamatkins8496
    @adamatkins8496 3 years ago

    Dr Bhatia, you are a gift to humankind!

  • @rafaynaeem7529
    @rafaynaeem7529 4 years ago

    Ma'am, I have watched three of your videos: single link, complete link, and average link, and finally I'm able to understand. Thank you, ma'am, for teaching us.

  • @zeyadahmad4556
    @zeyadahmad4556 5 years ago +8

    Thank you so much. You helped me understand this algorithm the day before my data mining final exam =D

  • @Surya_Kiran_K
    @Surya_Kiran_K 2 months ago

    Thanks so much❤❤ saved a lot of time before the exam

  • @abhishekbiswas3420
    @abhishekbiswas3420 4 years ago +3

    Thank you so much ma'am, this video really helped me understand the agglomerative hierarchical clustering numerical.

  • @siddhigoyal9362
    @siddhigoyal9362 1 year ago +1

    Your explanatory skills are amazing ma'am. Please continue making more such learning videos.

  • @darshanshah752
    @darshanshah752 5 years ago +4

    Lovely explanation. I have my master's exam tomorrow, and this just cleared all my doubts about agglomerative clustering. Thank you!!

    • @FarhanAli-dq8eh
      @FarhanAli-dq8eh 5 years ago

      Our professor is making us learn this in a sophomore data structures class and implement it in a project... I thought it was out of scope for our level.

  • @maheshasisirakumara4025
    @maheshasisirakumara4025 5 years ago

    Wow, clearly explained. I wonder whether every lecturer has your ability to explain.

  • @mandeepkhadka388
    @mandeepkhadka388 4 years ago

    Best explanation of agglomerative clustering so far, thank you ma'am!!! Everything is crystal clear now.

  • @amargupta7123
    @amargupta7123 1 year ago

    Very nice video and easy explanation.
    Thank you so much, please keep uploading such videos.

  • @emmayu4177
    @emmayu4177 4 years ago +2

    Thank you soooo much for your clear and detailed explanation!!!!♥ Never thought this concept could be so easy to understand!!~

  • @abhishekchourasia9252
    @abhishekchourasia9252 3 years ago

    A great, helpful, and very easy explanation. I understood every bit. Thank you.

  • @mvr192
    @mvr192 2 years ago

    Eye opening, thank you very much Anu! Much much respect to you!

  • @adamglynn1323
    @adamglynn1323 5 years ago +3

    Thank you so much for this! Need to know this for an exam in 2 days!

  • @yukinaproductions5647
    @yukinaproductions5647 4 years ago +3

    Thank you! I am studying for my statistics exams. My university's materials on this topic are so bad; you really helped me understand it :)
    On to your next video :)

  • @abhishekadivarekar2269
    @abhishekadivarekar2269 6 years ago +10

    Thank you for explaining how to solve this type of problem :-D
    Looking forward to more such videos.

  • @swagatmishra9350
    @swagatmishra9350 4 years ago

    Thank you very much for the neat and detailed explanation. You have explained each and every step very clearly.
    Thank you very much.

  • @its_the_yogi
    @its_the_yogi 5 years ago +14

    Really great presentation. I'd like to suggest a minor correction at 04:57: the formula for Euclidean distance is ((x-a)^2 + (y-b)^2)^(1/2).
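
In code that corrected formula is a one-liner; a minimal sketch (the sample points below are arbitrary, not values from the video):

    from math import hypot

    def euclidean(p, q):
        # hypot(dx, dy) returns sqrt(dx**2 + dy**2)
        return hypot(p[0] - q[0], p[1] - q[1])

    print(euclidean((0.0, 0.0), (3.0, 4.0)))  # 5.0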

  • @alexstefan8479
    @alexstefan8479 1 year ago

    I finally understand HAC MIN. Thanks a lot!!!

  • @jyhunter4716
    @jyhunter4716 2 years ago

    Thanks ma'am... wonderful explanation, and with this my exam syllabus got sorted...

  • @nasimmatar
    @nasimmatar 2 years ago

    Thank you Anuradha Bhatia, very good explanation. I hope to see more of your explanations of various topics in machine learning and data mining.

  • @premthapa9959
    @premthapa9959 2 years ago

    In the Euclidean distance formula, x is written in place of y.
    Perfect lecture, very helpful though, ma'am.

  • @mohamedsanogho8361
    @mohamedsanogho8361 1 year ago

    This is old, but you just saved another CS student's day!

  • @ranati2000
    @ranati2000 2 years ago

    Perfectly explained the concept.

  • @ikurious
    @ikurious 1 year ago

    Thank you so much, it saved me a lot of time.

  • @kyeelaquilariaty9234
    @kyeelaquilariaty9234 5 years ago +3

    Thank you so much, it was really helpful for preparing for my final exam :)

  • @atkuriajaykumar3701
    @atkuriajaykumar3701 5 years ago +1

    Thanks madam, I don't know how to thank you. You are helping a lot; it means a lot.

  • @luksb1054
    @luksb1054 2 years ago

    This is exactly what I was looking for. Thank you so much.

  • @fundamentalslearner7460
    @fundamentalslearner7460 11 months ago

    Perfect explanation. Superb

  • @k.m.emonahmed4938
    @k.m.emonahmed4938 3 years ago +32

    When I calculated each distance with a calculator using the Euclidean distance formula, I found several 0.01 differences that can change the whole dendrogram. For example, for (P3,P2) you calculated 0.15, but it should be 0.14.

    • @yegavintisumanth5347
      @yegavintisumanth5347 1 year ago +1

      Yes, I got the same: (P3,P2) is 0.14, and it will change the whole dendrogram.

    • @nathanmcnulty
      @nathanmcnulty 1 year ago

      Agreed -- the rounding errors in the distance matrix are problematic and should be corrected. Otherwise, those working through this example are likely to be confused when their own calculations yield different results.
      I would also suggest equalizing the axis scales for the x/y plot. The "stretch" in the x-axis relative to the y-axis is causing a visual disconnect between your calculated euclidean distances and the apparent closeness of points in the plot. For example, the (P2,P4) distance is 0.194 whereas the (P2,P5) distance is 0.143, but the axis scaling makes the (P2,P4) distance look like the shorter of the two.

    • @PranavMane-cg1cg
      @PranavMane-cg1cg 1 year ago

      Bro, it's not the fault of the sums; maybe you made a mistake in the basic maths. Go over grades 6 and 7 again.
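
The rounding issue raised in this thread can be checked by recomputing the distance matrix at full precision. A minimal sketch: the coordinates below are an assumption (they are the six points from the textbook example this video appears to follow); substitute the values shown in the video if they differ.

    from math import dist  # Python 3.8+

    # ASSUMED coordinates -- replace with the values from the video if they differ.
    points = {
        "P1": (0.40, 0.53), "P2": (0.22, 0.38), "P3": (0.35, 0.32),
        "P4": (0.26, 0.19), "P5": (0.08, 0.41), "P6": (0.45, 0.30),
    }

    for a in points:
        for b in points:
            if a < b:
                print(f"d({a},{b}) = {dist(points[a], points[b]):.4f}")

With these values d(P2,P3) comes out to about 0.1432, i.e. 0.14 at two decimal places, which matches the commenters above. Keeping extra decimals until the final step prevents rounding from flipping the order of the merges.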

  • @giovanni_ferreira
    @giovanni_ferreira 2 years ago

    Splendid approach in your explanation. Thanks a lot!

  • @imaadullahafridi1928
    @imaadullahafridi1928 4 years ago

    Wonderful explanation and concept delivery, please make more videos like this.
    Simply brilliant.

  • @Talksofarjuun2405
    @Talksofarjuun2405 3 years ago

    The best one, thank you so much. Tomorrow is my exam and now I'm confident about this problem 🤩

  • @ammarsinan2706
    @ammarsinan2706 2 years ago

    Thank you, your explanation is straight to the point

  • @stelducam8622
    @stelducam8622 3 years ago

    Thanks a lot for this video. I finally understood how this algorithm is performed. Thanks a lot

  • @pratheekhebbar2677
    @pratheekhebbar2677 2 years ago

    Your explanation is too good, ma'am.

  • @ramankumar41
    @ramankumar41 1 year ago +1

    Thanks a lot ma'am, this is the best!!!!

  • @АлександраМеркульева-т6щ

    Thank you for this excellent explanation.

  • @forumshah4950
    @forumshah4950 5 years ago +2

    You explained it so easily, ma'am ❤️

  • @Debduttadas1997
    @Debduttadas1997 2 years ago

    Thank you ma'am, it is very useful for studying numerical taxonomy. Thanks a lot, ma'am.

  • @obi6753
    @obi6753 3 years ago

    You are a great teacher.

  • @sheikashanurrahman7575
    @sheikashanurrahman7575 3 years ago +1

    Very easy and helpful. Thanks.

  • @shreyansjain8028
    @shreyansjain8028 5 years ago +2

    Thanks for this video! It was very clearly explained..

  • @YueHuang_Olivia
    @YueHuang_Olivia 5 years ago +1

    Thank you sooo much. It's a very clear explanation of this content!

  • @rabia17417
    @rabia17417 1 year ago

    Finally I got this ... Thanks a lot

  • @mohammedalbluwi2046
    @mohammedalbluwi2046 2 years ago

    Thanks a lot. You saved my life!

  • @MandeepSingh-ny9ok
    @MandeepSingh-ny9ok 5 years ago

    Very good explanation!
    The quantitative value (the level at which two leaves or two clusters meet in the dendrogram) should be made explicit.

  • @Vijai777
    @Vijai777 2 years ago

    Excellent teaching..

  • @bobbyyankey5967
    @bobbyyankey5967 6 years ago

    Thank you for the presentation. I have a better understanding of dendrograms now.

  • @sampathkengua1507
    @sampathkengua1507 3 years ago +2

    What if two values in the distance matrix tie for the minimum? Which one should we consider first? (At 5:45.)
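
On the tie question above (not addressed in the video): with single link, either tied pair can be merged first; the set of merge heights comes out the same, and only the order of the merges differs. Library implementations simply break ties deterministically, as in this SciPy sketch with an artificial tie:

    import numpy as np
    from scipy.cluster.hierarchy import linkage
    from scipy.spatial.distance import pdist

    # Three equally spaced points: d(A,B) == d(B,C) == 1.0, a genuine tie.
    X = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])
    Z = linkage(pdist(X), method="single")
    print(Z)  # two merges, both at height 1.0, whichever tied pair goes first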

  • @elvykamunyokomanunebo1441
    @elvykamunyokomanunebo1441 2 years ago

    Thank you for this hand calculation; it does a good job of explaining what's going on.
    It would be useful to have Python code where you do the calculations from scratch.
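
In the spirit of that request, here is a minimal from-scratch sketch of single-link agglomerative clustering (this is not the video author's code, and the example coordinates are placeholders to replace with the values from the video):

    from math import dist

    def single_link_hac(points):
        # Start with every point in its own cluster (clusters hold point indices).
        clusters = [[i] for i in range(len(points))]
        merges = []
        while len(clusters) > 1:
            best = None
            # Single link: the distance between two clusters is the MINIMUM
            # distance over all cross-cluster pairs of points.
            for i in range(len(clusters)):
                for j in range(i + 1, len(clusters)):
                    d = min(dist(points[a], points[b])
                            for a in clusters[i] for b in clusters[j])
                    if best is None or d < best[0]:
                        best = (d, i, j)
            d, i, j = best
            merges.append((clusters[i][:], clusters[j][:], d))
            clusters[i] = clusters[i] + clusters[j]  # merge cluster j into cluster i
            del clusters[j]
        return merges

    # Placeholder coordinates -- substitute the six points used in the video.
    pts = [(0.40, 0.53), (0.22, 0.38), (0.35, 0.32),
           (0.26, 0.19), (0.08, 0.41), (0.45, 0.30)]
    for a, b, d in single_link_hac(pts):
        print(a, "+", b, f"merged at distance {d:.3f}")

Each tuple in the returned list is one level of the dendrogram: the two clusters merged and the height (distance) at which they meet.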

  • @andrejborisow9426
    @andrejborisow9426 2 years ago

    Thank you for your great explanation

  • @rashmimbendakaluru7225
    @rashmimbendakaluru7225 3 years ago

    Thank you madam, it helped me a lot; it's an easy, clear explanation. In the video at 5:03 the formula is shown wrong, I think a typing mistake. One question I still have: after forming the cluster [P3,P6], in the distance matrix we replaced P3's place with [P3,P6]. Why can't we replace P6's place with [P3,P6]? Thank you in advance.
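
On the question above: it makes no difference which of the two slots the merged cluster takes, because under single link the new row is the element-wise minimum of the old P3 and P6 rows either way; keeping the lower-numbered slot is only a bookkeeping convention. A small sketch (the coordinates are placeholders, not confirmed from the video):

    from math import dist

    # Placeholder coordinates -- substitute the values from the video.
    P = {"P1": (0.40, 0.53), "P2": (0.22, 0.38), "P3": (0.35, 0.32),
         "P4": (0.26, 0.19), "P5": (0.08, 0.41), "P6": (0.45, 0.30)}

    # Row for the merged cluster {P3, P6}: element-wise minimum of the old rows.
    merged_row = {k: min(dist(P["P3"], P[k]), dist(P["P6"], P[k]))
                  for k in ("P1", "P2", "P4", "P5")}
    print(merged_row)
    # These values are identical whether they overwrite P3's slot or P6's slot,
    # so the later merges and the dendrogram do not change.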

  • @boabdl
    @boabdl 4 years ago

    You are a life saver. Thank you very much, love.

  • @ganxiongyong1027
    @ganxiongyong1027 1 year ago

    After my lecturer taught this subject I definitely did not understand it, but after hearing it here just once I already understand.

  • @rheaserarodrigues
    @rheaserarodrigues 3 years ago

    Thank you ma'am for this video and explanation... it really helped me ❤️

  • @Mohit-nw5jr
    @Mohit-nw5jr 5 years ago +1

    Awesome explanation! Hope to see more content from you in the future!

    • @AnuradhaBhatia
      @AnuradhaBhatia 5 years ago

      Hope to see you at VIT, as I have been invited as a session trainer.

    • @Mohit-nw5jr
      @Mohit-nw5jr 5 years ago

      @AnuradhaBhatia Oh cool, that's amazing! We are excited to have you here!

  • @afcmain
    @afcmain 1 year ago +1

    I didn't get how you are calculating the minimum distance between a cluster and a point.
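
For anyone with the same question, this is the standard single-link (MIN) rule: the distance from a cluster to a point, or to another cluster, is the smallest pairwise distance across the two groups, so after a merge each new entry is just the minimum of two old entries:

    d_min(A, B) = min over a in A, b in B of d(a, b)
    d_min(A ∪ B, C) = min( d_min(A, C), d_min(B, C) )

For example, the distance from the cluster {P3,P6} to P4 is simply the smaller of d(P3,P4) and d(P6,P4).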

  • @arifromadhan2515
    @arifromadhan2515 3 years ago +1

    Thank you for explaining...

  • @mohamadhallak8644
    @mohamadhallak8644 3 years ago

    It is really useful. Thanks a lot

  • @MayankKumar-qp5yy
    @MayankKumar-qp5yy 4 years ago

    Very nicely explained

  • @akshitagarwal98
    @akshitagarwal98 5 years ago

    Thanks, Ma'am. I like your teaching methodology.

  • @faizaman9267
    @faizaman9267 5 years ago

    Ma'am, you have drawn the dendrogram roughly. If we plot the dendrogram using the respective values on the x-axis and y-axis, it will come out quite different from what you have drawn.
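
A quick way to check the hand-drawn dendrogram against one plotted to scale is SciPy's dendrogram, which puts the actual merge distances on the vertical axis. A sketch with placeholder coordinates (substitute the six points from the video):

    import numpy as np
    import matplotlib.pyplot as plt
    from scipy.cluster.hierarchy import linkage, dendrogram
    from scipy.spatial.distance import pdist

    # Placeholder coordinates -- replace with the values shown in the video.
    X = np.array([[0.40, 0.53], [0.22, 0.38], [0.35, 0.32],
                  [0.26, 0.19], [0.08, 0.41], [0.45, 0.30]])

    Z = linkage(pdist(X), method="single")   # single-link merges and their heights
    dendrogram(Z, labels=["P1", "P2", "P3", "P4", "P5", "P6"])
    plt.ylabel("merge distance")
    plt.show()

This also speaks to the earlier comment about making the merge level explicit: the height of each horizontal link is exactly the single-link distance at which the two clusters were joined.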

  • @mohsala5498
    @mohsala5498 11 months ago

    Great videos, Anuradha!!! Why haven't you made any new videos in the last 4 years? Your detailed explanations and attention to nuance are amazing... I suggest you come back with more videos on ML & DL algorithms. Have a great day.

  • @otaku5869
    @otaku5869 2 years ago

    This is still a little confusing, but I understand it much better now. Thank you!

  • @jahnavichivukula
    @jahnavichivukula 3 years ago

    Wonderful explanation. On point👏🏻👌

  • @Niweera
    @Niweera 4 years ago

    Thank you madam, this cleared a big doubt I had.

  • @230489shraddha
    @230489shraddha 2 years ago

    Brilliant !! Very helpful

  • @manahel2084
    @manahel2084 5 years ago +1

    Great tutorial, You are awesome

  • @abhishekdewang4361
    @abhishekdewang4361 4 years ago +1

    This teaching was better than my class teacher's 😅

  • @kiran082
    @kiran082 4 years ago

    Excellent explanation. Thanks a ton.

  • @pashockz8990
    @pashockz8990 4 years ago

    Thank you very much. Such a great explanation.

  • @fatamatojjohora3755
    @fatamatojjohora3755 1 year ago

    Well explained. Thanks a lot.

  • @templar273
    @templar273 2 years ago

    Amazing explanation, I thank you so, so much!

  • @shvang8
    @shvang8 5 years ago

    Very nice explanation. Keep up the good work. God bless you!!

  • @ahmedelsabagh6990
    @ahmedelsabagh6990 3 years ago

    Very clear and helpful.

  • @diego.777ramos
    @diego.777ramos 5 years ago

    Excellent example, the best on YouTube :)

  • @atanpkabiriera4113
    @atanpkabiriera4113 6 years ago

    Thank you for sharing. A very simple and clear tutorial.

  • @stargamer9915
    @stargamer9915 3 years ago

    Awesome explanation, thank you 😊

  • @dipusaha7133
    @dipusaha7133 5 years ago +1

    At 5:30 (P1,P2) is 0.23, and at 5:33 (P1,P2) becomes 0.24... why?
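
For what it's worth, assuming the coordinates from the textbook example this video appears to follow (P1 = (0.40, 0.53), P2 = (0.22, 0.38); check against the values shown on screen), the distance works out as

    d(P1,P2) = sqrt((0.40 - 0.22)^2 + (0.53 - 0.38)^2) = sqrt(0.0324 + 0.0225) = sqrt(0.0549) ≈ 0.234

so 0.23 is the value correct to two decimal places, and the later 0.24 looks like a transcription slip.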

  • @anuroopks7104
    @anuroopks7104 8 months ago +1

    Thank you so much.

  • @vishwadeepbalyan9809
    @vishwadeepbalyan9809 4 years ago

    Excellent explanation... thanks a lot.

  • @raufkhamosh8888
    @raufkhamosh8888 5 years ago

    Thank you so much for the best explanation. The link you shared is not working. It would be great if you could share such videos related to the topic.

  • @uarangat
    @uarangat 4 years ago

    Many thanks, ma'am. Nice tutorials.

  • @xiaofangchen934
    @xiaofangchen934 4 years ago

    Thanks, it's pretty helpful. It helped me understand how the dendrogram in the book comes about.

  • @nichlaspetersen2795
    @nichlaspetersen2795 7 years ago

    Thanks for the clear explanation! Keep up the good work

  • @marcdomingo6590
    @marcdomingo6590 2 years ago +1

    you're the best :)

  • @geethakrishnasamy4838
    @geethakrishnasamy4838 7 years ago

    Too good Ms Bhatia...

  • @mathieuteushi999
    @mathieuteushi999 2 years ago

    Thank you, great video

  • @arnav0708
    @arnav0708 5 years ago

    Very nicely explained... thank you ma'am.