K-Mean Clustering

  • Published Feb 9, 2025
  • Data Warehouse and Mining
    For more: www.anuradhabha...

COMMENTS • 209

  • @sdelagey · 4 years ago +51

    Finally somebody who actually shows the calculations at every step! Thank you so much, you have my like!

  • @MechBasketMkII · 3 years ago +6

    3 years on and this is still very much useful! Thank you so much.

  • @phanurutpeammetta2066 · 4 years ago +7

    You saved my life! Wish me luck; I have an exam in the next 2 days. Hopefully I can utilize all that you taught in this video. Keep up the great work, ma'am!

  • @IlliaDubrovin · 1 year ago

    OMG, I've been looking for this for so long!!! You are the QUEEN!!!

  • @Me-dq5eo · 4 years ago +5

    I finally understand the math behind this process. Thank you for walking through it with actual data. This helps tremendously!

  • @joeaustinathimalamaria624 · 6 years ago +2

    Helped a lot. Can't thank this lady enough. Just a small correction: the distance is (x-a)^2 + (y-b)^2.
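
    The corrected formula this comment points at, distance = sqrt((x-a)^2 + (y-b)^2), can be sketched in Python (an illustration, not code from the video):

```python
import math

def euclidean(p, q):
    # d((x, y), (a, b)) = sqrt((x - a)^2 + (y - b)^2)
    return math.sqrt(sum((pi - qi) ** 2 for pi, qi in zip(p, q)))

print(euclidean((0, 0), (3, 4)))  # 5.0
```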

  • @johnmosugu · 2 years ago

    You made this look so clear and understandable. I sincerely appreciate you for this all-important K-means computation video!

  • @just_a_viewer5 · 1 year ago +1

    4:53 - 15^2 is 225, not 255

  • @PritishMishra · 3 years ago +1

    Ma'am, the explanation was CRYSTAL CLEAR. Thanks! Keep making these types of tutorials; it really, really helps.

  • @gopalakrishna_chinta · 2 years ago

    Your way of explanation is easy to grasp, ma'am. Thank you 😇

  • @0youtubing0 · 6 years ago +42

    Thanks for the informative video! At 3:15, the variable should be 'y' instead of 'x': (y-b)

  • @surbhiagrawal3951 · 4 years ago +1

    The best video seen so far on K-means

  • @akankshamishra1139 · 1 year ago

    Thank you, Anuradha ji. Finally I understood what K is, what the mean is, what a centroid is, and what Euclidean distance is. Please create more videos covering the major ML algorithms.

  • @shalinisoni4257 · 4 years ago

    Your video is easy to understand.... Very nice, ma'am.

  • @paulourbanosegoper1216 · 2 years ago

    woaaaa, this really really helps me understand more about K-means

  • @PanashePhotography · 3 years ago +1

    Really straightforward and easy to understand.

  • @OviPaulDgRimJoW · 7 years ago +4

    Thank you very much. I was really confused about how the implementation of this algorithm would go, but you made it really easy to understand.

  • @certguide · 3 years ago

    This example is a bit difficult; you can simply take 2 rows directly and group them as 1 & 2, then find the Euclidean distance from each row; the shortest distance gives the new row's group from 1 & 2, and the mean will be the grouped rows' sum/2.
    Now the real concept, for whoever might go through my comment: K-means clustering is about finding a way to group similar sets of data (of any type, actually), so why do we need a mean here?
    1. When you calculate the distance from one point to another you simply take a-b (and you know that a>b); however this may not be possible in graphs or a 3-dimensional plot, so you take the square of the sum of the distances for the 2 values x,y and then you take a root so that if in a-b a

  • @justateenager9773 · 4 years ago

    Really excellent, ma'am...

  • @nehac9035 · 4 years ago

    The formula for calculating the Euclidean distance needs an update: it contains (x-a)+(x-b) but should be (x-a)+(y-b). Also check the square of 17; it should be 289, not 283.

  • @varadaraajjv4973 · 4 years ago

    Excellent video.... Thanks a lot, ma'am... you saved my time.

  • @rajakcin · 1 year ago

    Thanks for the nice explanation, it helps.

  • @kavibharathi2913 · 2 years ago

    Very very useful 👍 Thank u so much......💫

  • @20shwetha · 2 years ago

    very very useful video thank you so much madam.

  • @nimrafaryad4103 · 2 years ago +1

    Thanks mam 👍🏼Jazak Allah mam

  • @prasadnagarale6274 · 5 years ago +4

    I think you did it for one iteration only, but in the next iteration any point can change its cluster, since the two means have changed.
    So basically we need to repeat the same procedure until the cluster means do not change for two consecutive iterations.

    • @prasannavi1911 · 5 years ago

      Prasad Nagarale, agreed. I have one question: which centroid values should we consider for the next iteration?
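
    The re-iteration this thread describes is standard Lloyd's-style k-means: reassign every point, recompute the means, and repeat until the centroids stop moving; the freshly recomputed means are the centroids for the next iteration. A minimal sketch on hypothetical data (not the video's):

```python
def kmeans(points, centroids, max_iter=100):
    for _ in range(max_iter):
        # Assignment step: each point joins its nearest centroid (squared Euclidean).
        clusters = [[] for _ in centroids]
        for p in points:
            i = min(range(len(centroids)),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centroids[j])))
            clusters[i].append(p)
        # Update step: each centroid becomes the mean of its assigned points.
        new = [tuple(sum(cs) / len(cs) for cs in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:   # converged: no centroid changed between iterations
            break
        centroids = new
    return centroids, clusters

pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(kmeans(pts, [(0, 0), (10, 10)])[0])
```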

  • @pratheekhebbar2677 · 3 years ago

    a big thanks to you for this wonderful explanation

  • @coolbreeze007 · 1 year ago

    Thanks for the amazing job.

  • @dnfac · 4 years ago +1

    Really simple and clear to understand, congratulations!

  • @Lens_lores · 6 years ago

    Thank you, Anuradha for such a comprehensive example.

  • @nadellagayathri · 3 years ago

    Anuradha, great work. Nowhere else did I get this detailed an explanation. Please try to make videos on deep learning algorithms in a detailed way like this.

  • @RAVI2012ification · 4 years ago

    awesome explanation

  • @amilcarc.dasilva5665 · 5 years ago +3

    Thanks a lot. Systematic explanation and crystal clear.

  • @sachinjagtap8936 · 2 years ago

    Great stuff, thanks for explaining.

  • @jiayuliao4358 · 5 years ago +1

    very clear explanation!

  • @sathwik98 · 6 years ago +2

    Thanks for being my Savior for 10 marks.

  • @AnselmGriffin · 3 years ago

    Are you sure your values are right? At 9:31 the new K1 should be (185+179+182)/3 and (72+68+72)/3. Also at 10:20, k1 = (185 + 179 + 182 + 188)/4 and k2 = (72+68+72+77)/4; the final centroid should be 183.5000, 72.2500 for K1 and 169.0000, 58.0000 for K2. I ran it in MATLAB and MATLAB confirms these answers.
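
    The centroid arithmetic in this comment is easy to re-check; a quick sketch using the four K1 members quoted above:

```python
k1 = [(185, 72), (179, 68), (182, 72), (188, 77)]  # (height, weight) pairs from the comment

cx = sum(h for h, _ in k1) / len(k1)  # (185 + 179 + 182 + 188) / 4
cy = sum(w for _, w in k1) / len(k1)  # (72 + 68 + 72 + 77) / 4
print(cx, cy)  # 183.5 72.25
```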

  • @tshende02 · 2 years ago

    Thank you so much mam,, love 😍 ❤️

  • @janaspirkova4181 · 6 years ago +1

    Dear Anuradha, thank you so so much.

  • @kowsisweety9113 · 5 years ago +4

    Hi ma'am,
    Good and neat explanation of the k-means algorithm; it was very useful for me.
    I need an explanation of CLARA and CLARANS (partitioning algorithms) for my exam.

  • @shabnamparveen6785 · 3 years ago

    very good session... but it should be x-a and y-b.

  • @lavanyarajollu4122 · 4 years ago +1

    Best one from all others👏

  • @mkClipsHub · 2 months ago

    great madam

  • @ashutoshanadkarni4588 · 6 years ago

    Well explained. Just a minor suggestion: most people watch on mobile, so it would be good to use the entire screen rather than a static title on the left. I liked the video.

  • @LetYT1999 · 4 years ago

    Thank you mam for detailed explanation

  • @nynebioglu · 5 years ago +1

    Great explanation for K-means!
    Thanks.

  • @yashthaker9288 · 6 years ago +2

    Thank you so much ma'am for amazing explanation!

  • @navulurinitishkumar3911 · 2 years ago

    Can we randomly take any two initial centroids?

  • @mailanbazhagan · 6 years ago

    great and easily understandable explanation.

  • @br1batman287 · 2 years ago

    In the next dataset, the 3rd one, 17 squared should be 289 but it's written as 283. I know the answer is correct; just informing.

  • @hafdeabdeali7334 · 3 years ago

    Thanks a lot 🥰🥰🥰🥰

  • @computology · 2 years ago +1

    Why are you updating the mean of the cluster after every assignment? Aren't we supposed to update the mean after the completion of a single iteration, according to the original algorithm?
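
    For context on this question: updating the mean after every single assignment is the incremental (MacQueen-style) variant, while the textbook Lloyd's algorithm recomputes means only after a full pass over the data. The incremental running-mean update can be sketched as (hypothetical helper, illustrative values):

```python
def add_to_cluster(mean, n, x):
    # Running mean: after adding point x to a cluster of n members,
    # new_mean = mean + (x - mean) / (n + 1), applied per coordinate.
    n += 1
    return tuple(m + (xi - m) / n for m, xi in zip(mean, x)), n

mean, n = (185.0, 72.0), 1            # cluster starts with one point
mean, n = add_to_cluster(mean, n, (179, 68))
print(mean)  # (182.0, 70.0), the mean of the two points
```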

  • @tucuptea3689 · 1 year ago

    Ma'am, how do we know that we have assigned the initial 2 items to the right clusters?

  • @liyaelizabethantony5675 · 3 years ago

    at 3:50 it's y-b (Euclidean distance) and not x-b

  • @anshumansingh6969 · 5 years ago +8

    The mean is calculated in a wrong manner; we have to take the average of all values in our set whenever a new value is added.

    • @tahaali01 · 5 years ago +2

      you are correct

    • @vaddadisairahul2956 · 4 years ago

      I think the mean is not calculated at every step, based on your and her explanations. First, we assign all the data points to their nearest cluster, and then take the average of all the points in a cluster as a whole.

  • @vigneshwaravr3283 · 3 years ago

    Fully explained

  • @bhartinarang2078 · 7 years ago

    Wow 💪 Now we will have DWM videos. Thanks, madam. Please keep them coming; your content is helping us, and yes, the BDA paper was lengthy, but your videos covered 30 marks or more altogether: PageRank, sums, the FM algorithm.

    • @AnuradhaBhatia · 7 years ago

      Thanks,
      will surely put them up.

    • @AnuradhaBhatia · 7 years ago +1

      Uploaded: Hierarchical Agglomerative Clustering and the Apriori Algorithm.

    • @bhartinarang2078 · 7 years ago

      Anuradha Bhatia yes madam, I got the notification :) thanks.

    • @AnuradhaBhatia · 7 years ago +1

      More following.

    • @bhartinarang2078 · 7 years ago

      Anuradha Bhatia That's great. Waiting eagerly :)

  • @vashmchannel7266 · 4 years ago +1

    Love your lectures

  • @linguafranca7834 · 2 years ago

    thanks maam.

  • @dasarojujagannadhachari8015 · 6 years ago +1

    Wonderful lecture mam.. thank you

  • @roshanmagdum3822 · 5 years ago

    thanks mam for explanation

  • @MegaDk13 · 6 years ago

    The cluster assignment for the first 2 clusters is an assumption, though we can justify it by the Euclidean distance calculation.

  • @mprasad3661 · 5 years ago

    Great explanation madam

  • @annaet7769 · 5 years ago +1

    Thank you😊

  • @nikhilmkumar2765 · 6 years ago

    Thanks a lot, ma'am. Helped me for my exam!

  • @jiyuu329 · 4 years ago

    What is the point of finding the distance between the two initial clusters?
    The points themselves are the centroids of their respective clusters, right?

  • @Naweeth03 · 5 years ago

    Hi ma'am,
    I want an answer to this question: Assume you are given n points in a D-dimensional space and an integer k. Describe the k-means++ algorithm for clustering the points into k clusters.
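
    On the k-means++ question: the seeding picks the first centroid uniformly at random, then draws each subsequent centroid with probability proportional to its squared distance from the nearest centroid chosen so far. A sketch (illustrative only, not from the video):

```python
import random

def kmeans_pp_init(points, k, rng=None):
    rng = rng or random.Random(0)
    centroids = [rng.choice(points)]          # first centroid: uniform at random
    while len(centroids) < k:
        # squared distance from each point to its nearest chosen centroid
        d2 = [min(sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids)
              for p in points]
        # sample the next centroid with probability proportional to d2
        r = rng.uniform(0, sum(d2))
        acc = 0.0
        for p, w in zip(points, d2):
            acc += w
            if acc >= r:
                centroids.append(p)
                break
    return centroids

pts = [(0, 0), (0, 1), (10, 10), (10, 11)]
print(kmeans_pp_init(pts, 2))
```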

  • @RishabVArun · 4 years ago +1

    Don't update the cluster centroid after every assignment; update it after a whole iteration (when all assignments in one iteration are complete).

  • @pranavyadav4597 · 4 years ago

    Madam, can you put more data warehouse topics on YouTube?

  • @yashigarg5517 · 6 years ago

    Very well explained😘

  • @anuragsingh5850 · 3 years ago

    thank you

  • @MrHardrocker98 · 4 years ago

    Why didn't you update the means as you did in the single-dataset video?

  • @aleeibrahim8672 · 6 years ago +2

    Shouldn't we update the centroid once we have found all the distances between every data point and the previous centroid? You updated it with the average of the first two points only. Why?

  • @sujathasivakumar6422 · 5 years ago

    Well explained, ma'am. Thank you.

  • @SantoshRaj-hx7rx · 6 years ago

    Thanks for the explanation... it is very clear...

  • @PR-ql7tg · 3 years ago

    4:23 wrong formula: you forgot the y and replaced it with x

  • @Uma7473 · 5 years ago +1

    Thank you 👏👏👏🙏👼

  • @kamalb3326 · 7 years ago

    Superb explanation. Thank u

  • @APREETHAMCS · 4 years ago +1

    Great! Thank you for the video!

  • @tapanjeetroy8266 · 5 years ago

    Thank you mam

  • @ghoshdipan · 4 years ago

    How can we find the K value for a large data set?
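
    On choosing K for a large dataset: a common heuristic is the elbow method: run k-means for K = 1, 2, 3, ... and plot the total within-cluster sum of squared distances (the inertia), then pick the K where the curve bends. A sketch of the inertia computation (hypothetical helper name, illustrative data):

```python
def inertia(points, centroids):
    # Total within-cluster sum of squared distances to the nearest centroid;
    # this is the quantity the elbow plot tracks as K grows.
    return sum(min(sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids)
               for p in points)

pts = [(0, 0), (0, 2), (10, 10)]
print(inertia(pts, [(0, 1), (10, 10)]))  # 1 + 1 + 0 = 2
```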

  • @ismailkarnanokung4753 · 6 years ago

    Thanks for everything; this video is very instructive...

  • @spandanamanoj4890 · 6 years ago

    Wonderful lecture, ma'am... The "for more videos" link is not working... Kindly post more videos on data mining. Thank you.

  • @TheKnowledgeGateway498 · 4 years ago +1

    What was the relevance of the (0, 21.93) values? There was no point in calculating that.

  • @FunmiOg · 6 years ago

    Thank you so much, ma'am. Very helpful video.

  • @panostzakis6925 · 2 years ago

    Thanks for your help; I appreciate your time. But maybe there is a mistake in the Euclidean distance: [(x,y),(a,b)] = root of (x-a)^2 + ... (y-b)^2 and not ...(x-b)^2, in my point of view!!

  • @heartborne123 · 4 years ago

    What is the point of calculating the distance from centroid 1 to centroid 1 and from centroid 2 to centroid 2? Isn't it obvious the distance in this case is going to be 0?

  • @mariomartinsramos6450 · 7 years ago

    Best explanation ever about k-means!
    Thanks!!

  • @agnimchakraborty1112 · 4 years ago +1

    Ma'am, in which cluster should we assign the coordinate (or data point) if the Euclidean distance is the same from both clusters?
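
    On ties: most implementations break equal distances deterministically; for example, Python's min (and NumPy's argmin) keeps the first, i.e. lowest-indexed, cluster. A tiny illustration:

```python
distances = [5.0, 5.0]  # the point is equidistant from both centroids
cluster = min(range(len(distances)), key=distances.__getitem__)
print(cluster)  # 0: min keeps the first of the tied candidates
```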

  • @mahmoudabdulhamid4182 · 10 months ago

    How can I get these slides?

  • @shalinisoni4257 · 4 years ago

    Ma'am, please make a video on how to solve cosine similarity example sums.... Please.

  • @saurabhzinjad · 5 years ago +8

    15 squared is 225, not 255. Please watch out for small mistakes. The mean calculation is also wrong.

    • @alex-ek8vt · 4 years ago

      I'll send you straight into the vacuum!

  • @diego.777ramos · 5 years ago +1

    Excellent channel, thank you very much for your knowledge; liked and subscribed.

  • @NandDulalDas2810 · 6 years ago +2

    simply wonderful/easy to understand

  • @mebrunoo · 6 years ago +1

    thank you for the good and detailed explanation

  • @yusufbaseera1731 · 4 years ago

    Please, how can one select a cluster head after getting the nodes into clusters? Please help me out.

    • @fardeen2158 · 4 years ago

      You have to select random ones; better to select any initial one.

    • @yusufbaseera1731 · 4 years ago

      @@fardeen2158 Please help me out properly; I don't understand selecting the initial one, please.

  • @neenapiiii · 5 years ago

    thank you

  • @coolestcatintown1501 · 6 years ago +1

    Perfectly clear, thank you very much.

  • @x2diaries506 · 6 years ago

    @Anuradha Bhatia, can you help me apply K-means clustering to localization using VLC?

  • @xc295 · 3 years ago

    The centroid coordinates are continuously changing. Initially we took the coordinates of points 1 and 2 as the two centroids, so should we not re-check whether points 1 and 2 still belong to the initial cluster to which they were assigned?