Hierarchical Agglomerative Clustering [HAC - Complete Link]

  • Published 17 Nov 2024
  • Data Warehouse and Mining
    For more: www.anuradhabha...

COMMENTS • 102

  • @chaosNinja790 · 8 months ago · +3

    You saved me, I got 37/40 in data mining. Thank you❤

  • @wessauder7708 · 7 years ago · +8

    Thank you for this walkthrough! It is very well done. I was looking everywhere to find an example of how to update the cluster matrix, and this really helps. It is extremely clear. Thank you for this series on clustering.

  • @lancezhang892 · 2 years ago

    Finally I understand how to merge these points.

  • @s.r.3924 · 2 years ago

    You are saving my exam in data mining. Thank you very much!

  • @mahdi_shahbazi · 2 years ago

    God bless you for this simple yet informative explanation of agglomerative clustering.

  • @mutebaljasem9734 · 3 years ago

    She explained Hierarchical Agglomerative Clustering very well. Big thanks!

  • @amarimuthu · 7 years ago · +7

    Hi, at 3:50 the Euclidean distance should have 'y-b' instead of 'x-b' for the second value. Thanks, and nice explanation.

    • @AnuradhaBhatia · 7 years ago · +1

      Yes Sir,
      Thanks

    • @amarimuthu · 7 years ago

      Anuradha Bhatia Thank you for your teaching; it helps students like me with easily understandable steps.

    • @AnuradhaBhatia · 7 years ago · +5

      Thank you so very much for the motivation.

  • @ShortJokes4U-p9t · 4 years ago · +5

    Good lecture. A small mistake at 4:05: the Euclidean distance formula should be sqrt((x - a)^2 + (y - b)^2).
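
  • Note: to make the corrected formula concrete, here is a minimal Python sketch (the two sample points are illustrative, not necessarily the video's values):

        import math

        def euclidean(p, q):
            # p = (x, y), q = (a, b):
            # sqrt((x - a)^2 + (y - b)^2) -- note (y - b), not (x - b)
            return math.sqrt((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2)

        print(euclidean((0.40, 0.53), (0.22, 0.38)))  # ~0.2343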

  • @Mynameisjoof · 7 years ago · +1

    Best explanation of complete linkage I have found. Thank you so much!!!!

  • @jules_tbl1010 · 3 years ago

    Thank you. I've been studying from a manual, and its explanation is not even close to this one.

  • @vanlalhriatsaka8054 · 5 years ago · +1

    Very clear explanation, ma'am. One big request: please also cover the centroid method.

  • @priyankkharat5686 · 7 years ago · +1

    It's very helpful for our ongoing exams. Thank you so much, ma'am.

  • @mahakalm395 · 5 days ago

    All my doubts are clear now, thank you so much. :)

  • @dcharith · 5 years ago · +1

    Thank you very much for taking the time to post these really helpful videos!

  • @HugoRamirezSoto · 7 years ago · +2

    I would like to thank you for this video. Your explanation is magnificent and so clear. You helped me a lot in comprehending these complex subjects. Greetings from Mexico.

  • @mythicallygold1625 · 5 years ago · +2

    I like this video: good explanation, good step-by-step guide. I'd say it's more effective than what my teacher taught. Thumbs up :3

  • @mahmoudelkafafy9982 · 5 years ago · +3

    Hello, thanks a lot for the simple and clear explanation of single linkage (previous video) and of complete linkage as well. I have two questions. 1) Looking at the dendrograms obtained from single linkage and complete linkage, one can see that they are different. How should we interpret that? If I cut the trees at the same value (for both single and complete linkage), I would obtain different clustering results. 2) What is the idea behind searching for the maximum distance in the case of complete linkage?
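
  • Note: on the two questions above: the trees differ because single link merges on the closest pair (so clusters can chain through near neighbours), while complete link merges on the farthest pair, which keeps every cluster's diameter bounded by the merge height; that is the idea behind taking the maximum. A minimal SciPy sketch (toy data, not the video's) showing that cutting both trees at the same height can give different flat clusterings:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        # five points on a line: a loose chain plus a tight pair
        pts = np.array([[0.0], [0.3], [0.6], [2.0], [2.3]])

        single = linkage(pts, method='single')
        complete = linkage(pts, method='complete')

        # cut both trees at the same height
        print(fcluster(single, t=0.5, criterion='distance'))    # 2 clusters
        print(fcluster(complete, t=0.5, criterion='distance'))  # 3 clusters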

  • @k.kaushikreddy1792 · 4 years ago

    Very lucid explanation. Keep up the great work!

  • @KalusivalingamThirugnanam · 5 years ago · +1

    Thanks, ma'am, for explaining this. Very useful.

  • @akashr9973 · 3 years ago

    Thank you, madam, very convincing explanation!

  • @ashishdevassy · 5 years ago · +5

    This was very helpful. Thank you.

  • @vmudivedu · 6 years ago

    Thank you, ma'am. That was a clean video and it helped me a lot in understanding complete link. I have a few questions:
    1. How does the merge criterion influence the merge decision?
    2. Why is complete-link clustering called non-local, while the single-link criterion is called local?

  • @nononnomonohjghdgdshrsrhsjgd · 4 years ago

    Very good explanations! Can you please show an example of how to use the correlation matrix as a distance matrix in k-means? You applied Euclidean distance in k-means to cluster. How does the calculation of the clusters work when you take not the original dataset but the correlation matrix? How do you use the correlation matrix to build k-means clusters? Thank you!

  • @arultherule · 3 years ago

    Thanks a ton for a fantastic explanation, madam!
    When we first start the merging process, shouldn't we pick P5 and P6 to merge first, since they have the max value 0.39?

    • @MrChabonga · 3 years ago

      The first cluster is determined by the most similar units. After that, we define the distance from that cluster to the other data points through either single linkage (taking the minimum distance) or complete linkage (taking the maximum distance).

  • @bhartinarang2078 · 7 years ago · +2

    COMPLETE LINK - it means that while updating the distance matrix, we take the maximum value, right?
    SINGLE LINK - while updating the distance matrix, we take the minimum value?
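
  • Note: yes, with one caveat: the pair chosen to merge is always the smallest entry in the current matrix; the maximum (complete link) or minimum (single link) is only used when recomputing the merged cluster's distances to the remaining clusters. A minimal NumPy sketch of that update step (illustrative, not the video's code):

        import numpy as np

        def merge_update(D, i, j, method='complete'):
            # D is the square distance matrix; clusters i and j merge.
            # The merged cluster's distance to every other cluster k is
            # max(D[i,k], D[j,k]) for complete link, min(...) for single.
            f = np.maximum if method == 'complete' else np.minimum
            D = D.copy()
            D[i, :] = f(D[i, :], D[j, :])
            D[:, i] = D[i, :]
            D[i, i] = 0.0
            keep = [k for k in range(len(D)) if k != j]
            return D[np.ix_(keep, keep)]

        D = np.array([[0.00, 0.24, 0.22],
                      [0.24, 0.00, 0.15],
                      [0.22, 0.15, 0.00]])
        print(merge_update(D, 1, 2))  # merge the closest pair (1, 2)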

  • @tranminhtien172 · 5 years ago

    Pretty useful video! Can you share this slide?

  • @laxmivyshnavidokuparthi4164 · 6 years ago

    Superb explanation.

  • @Kaidanten · 4 years ago · +1

    Does the formula at 3:44 contain a typo? Should it be (y-b)^2, not (x-b)^2?

  • @jobaidajarin356 · 3 years ago

    Thank you, ma'am. It helps a lot.

  • @hy8040 · 4 years ago

    Really helped! Easy to understand the concept! Thanks~

  • @guliteshabaeva8922 · 1 year ago

    Very, very clear, thank you!

  • @DeepakPatter · 6 years ago

    Clean and precise video. Really helped. Thank you.

  • @sumanthkumar4035 · 3 years ago

    Why did we start with P3 and P6? Shouldn't we start with the pair that has the max distance between them?

  • @muhammadfirdaus7278 · 6 years ago

    Thank you for the amazing explanation.

  • @sebastiantischler8410 · 2 years ago

    What happens when you update your distance matrix and then there are two (or more) minimum values?

  • @benjaminmusasizi3777 · 5 years ago

    Thanks, ma'am. Very well explained!!

  • @yerramillihemanth2998 · 2 years ago

    Can you please explain what to do when the matrix has two of the same low value (e.g., if P1 and P2 have 0.12 and P3 and P4 also have 0.12)? In that case, which points should be considered?

  • @SelenaFriend · 5 years ago · +1

    Thank you so much for this, it really helped me!

  • @bhaskarp1063 · 3 years ago

    Quick question: how do we merge when the index of the min element is (1,0) or (0,1)?

  • @annmaryjoseph684 · 3 years ago

    Thank you very much

  • @alxjf · 6 years ago

    Helped me a lot. Thank you.

  • @quicklook3908 · 4 years ago

    Ma'am, I have a question: should we consider the least value overall, or the least value from the lower triangle of the distance matrix?

  • @tamilarsang.s1426 · 7 years ago · +2

    Please upload classification sums: Naive Bayes, Bayesian, and ID3.
    Your videos are very helpful for Mumbai University students. Try to solve them with the same method they follow. Thanks!

  • @uarangat · 6 years ago

    Thanks, clear explanation.

  • @EvelynJenkins-yu5wi · 3 years ago

    What should we do if there are two identical smallest elements?

  • @SamirAliyev771 · 4 years ago

    Salam. Thanks a lot :). Excellent job.

  • @payelbanerjee9192 · 7 years ago · +1

    Dear ma'am, I would like to know about the cases where, after computing the similarity matrix, we find two equal lowest distances. We can choose either of them to merge at that step, and this decision may affect the cluster output at the final stage. Here I am talking about the case when a distance threshold is applied. Say, for example, {1,2,3,4,9,8,7}: if we take a threshold of 1, then the clusters are {1,2},{3,4},{9},{8,7}. The clusters can also be {1,2},{3,4},{8,9},{7}. Any solution to this problem? Please reply. Thanks.

    • @AnuradhaBhatia · 7 years ago

      Hello madam,
      Both clusterings can be formed with threshold 1. When implementing clustering on a real-world problem, other factors are also considered along with these.

    • @payelbanerjee9192 · 7 years ago

      OK, so both of them can be the answer, am I right? And if any other factors are taken into consideration, then we have to choose a single one.

    • @AnuradhaBhatia · 7 years ago

      Right.
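
  • Note: to make the exchange above concrete, a small SciPy run on the 1-D data from the question. Library implementations break such ties deterministically (typically by index order), so they return one of the equally valid groupings; which one you get is an implementation detail.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        x = np.array([[1.0], [2.0], [3.0], [4.0], [9.0], [8.0], [7.0]])
        Z = linkage(x, method='complete')
        # keep together only clusters whose diameter is <= 1
        print(fcluster(Z, t=1.0, criterion='distance'))
        # one of the two groupings discussed above, e.g.
        # {1,2} {3,4} {7,8} {9} -- the tie order decides whether 7 or 9
        # is left as the singleton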

  • @kg3217 · 3 years ago

    Why do we call it "complete" and "single" linkage? In both videos the difference was whether to take the minimum or the maximum distance after forming a cluster; is there any other logical reason behind the naming?

  • @fauzidaniqbal2564 · 4 years ago

    What if there is more than one smallest value? Example: the values of both (1,4) and (2,5) are 1.

  • @atulgupta-sl1zw · 7 years ago

    Dear ma'am,
    If we are given a similarity matrix instead of a distance matrix, what is the approach?
    Regards,
    Atul

  • @pratiksharma1655 · 5 years ago

    Amazing... Cheers

  • @ashtonuranium2994 · 4 years ago

    Thanks so much...

  • @ameliaachung · 5 years ago

    Lifesaver!! Thank you :)

  • @mahirkhan4124 · 6 years ago

    Can the final dendrogram of both complete and average link be the same?

  • @jmg9509 · 2 years ago

    3:15 - Complete Linkage

  • @ruler5408 · 4 years ago

    Awesome

  • @payelbanerjee9192 · 7 years ago

    Ma'am, everywhere it is written that the space complexity of the naive hierarchical complete-linkage clustering algorithm is O(n), but as far as I know, if all the pairwise distances are stored for the distance matrix, then the space complexity should be O(n^2). Will you please let me know why the space complexity is O(n)?

    • @AnuradhaBhatia · 7 years ago

      Because every distance is computed and used exactly once.

    • @payelbanerjee9192 · 7 years ago

      Would you please clarify slightly?
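
  • Note: to expand on the reply above: storing the full matrix really does take O(n^2) space, since n points have n(n-1)/2 pairwise distances; for n = 10,000 that is about 5 * 10^7 distances, roughly 400 MB as 8-byte floats. The O(n) figure refers to algorithms such as SLINK (single link) and CLINK (complete link), which keep only a few length-n arrays and compute each pairwise distance on the fly, using it exactly once rather than storing it.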

  • @farahadilah4694 · 6 years ago

    Hi, why is it that we always have to find the minimum value in the lower triangle, but at the last stage (11:53) we just find the smallest value in the whole distance matrix?

    • @AnuradhaBhatia · 6 years ago

      farah adilah It is the lower triangle, so the smallest value.

    • @farahadilah4694 · 6 years ago

      Oh, so no matter whether it's complete, single, or average linkage, we always take the smallest value?
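
  • Note: yes, in single, complete, and average linkage alike, the pair merged next is always the smallest entry in the current distance matrix; the linkage method only changes how the matrix is updated afterwards. And since the matrix is symmetric, scanning the lower triangle or the whole off-diagonal matrix finds the same minimum. A small NumPy illustration:

        import numpy as np

        D = np.array([[0.00, 0.23, 0.22],
                      [0.23, 0.00, 0.14],
                      [0.22, 0.14, 0.00]])
        M = D.copy()
        np.fill_diagonal(M, np.inf)  # ignore self-distances
        i, j = np.unravel_index(np.argmin(M), M.shape)
        print(i, j)                  # the closest pair -> merge next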

  • @venkateshvelagapudi5240 · 6 years ago

    Can anyone please share the code for hierarchical clustering?
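
  • Note: since several people have asked for code, a short SciPy sketch (not the video's own code; the six coordinates are the standard textbook example this lecture series appears to use):

        import numpy as np
        import matplotlib.pyplot as plt
        from scipy.cluster.hierarchy import linkage, dendrogram

        pts = np.array([[0.40, 0.53], [0.22, 0.38], [0.35, 0.32],
                        [0.26, 0.19], [0.08, 0.41], [0.45, 0.30]])

        Z = linkage(pts, method='complete')  # complete-link HAC
        print(Z)  # each row: cluster1, cluster2, merge distance, new size

        dendrogram(Z, labels=['P1', 'P2', 'P3', 'P4', 'P5', 'P6'])
        plt.show()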

  • @zixiaozong2048 · 5 years ago

    Helps a lot!

  • @georgygursky4303 · 5 years ago

    Thank you!

  • @shubhamnayak9369 · 7 years ago

    Thank you, madam.

  • @mohammedsiraj673 · 7 years ago

    Thank you for the clear explanation :)

  • @satyamas3886 · 5 years ago

    Thanks.

  • @NtinosParas · 5 years ago

    Well done. Thank you! :D

  • @djzero669 · 5 years ago

    Thanks! =D

  • @СергейВакульчик-с6п

    Thanks!

  • @nileslystatozero9869 · 5 years ago

    ❤️

  • @looploop6612 · 6 years ago

    y-b

  • @nisajafernando3767 · 3 years ago

    Thank you!