Introduction to the Multinomial Distribution

  • Published 24 Dec 2024

COMMENTS • 149

  • @JoeSwansonsLegs
    @JoeSwansonsLegs 2 years ago +62

    10 years later, nothing has been able to beat this explanation. Very much appreciated!

  • @jbstatistics
    @jbstatistics  11 years ago +75

    Thank you! I'm very glad you find these videos useful. I created them mainly with my own students in mind, but I'm very glad that others around the world find them helpful.

  • @velocitylad2347
    @velocitylad2347 9 months ago +2

    had to login and leave a comment, your videos are a gem. thanks for saving my probability and statistics course!

  • @jbstatistics
    @jbstatistics  11 years ago +32

    Thanks very much for the compliment. It's not the easiest thing in the world to do well, and there are a number of skills and talents required, so it's not too surprising there can be some poor quality stuff out there. I don't profess to be an expert in videos, but I'm doing my best (on my own) to make the best ones I can. Some of my earlier videos were plagued by some of the negatives you mention, but I've refined my technique a little since then. I'm very glad you find them helpful.

  • @natureshorts6657
    @natureshorts6657 2 years ago +7

    Statistics grad student here. Your videos are always so incredibly clear. It is so helpful to have an example alongside the general mathematical notation to understand when we would use this distribution and to have everything worked out. You are so incredibly helpful. Thank you!

  • @jbstatistics
    @jbstatistics  11 years ago +5

    Thanks so much! I'm definitely going to keep going -- I'll be adding new videos in the coming weeks and months. Cheers from Canada.

  • @animalamor915
    @animalamor915 11 years ago +1

    Just know that you're an exceptional teacher, and without your videos I would be lost all quarter!!!! Thank you so much!

  • @switchwithSagar
    @switchwithSagar 8 months ago +1

    This playlist is a gem. Came here looking for the multinomial distribution after using it for sampling in PyTorch.
    The following explanation of the number of possible orderings might help myself (in future) and fellow learners -
    Example - Number of ways to permute/order an "n"-letter word = n!
    Suppose word = AAAA. Possible orderings = 4!, but as all letters in the word are the same (not distinct, i.e. even if I change the position of any of the A's it won't matter), divide by 4!. Suppose word = "AABC"; then orderings = 4! / 2!, divided by 2! because the two A's are the same (not distinct).
    Same case here with sampling 10 Americans. The number of ways to permute/order = 10!, but out of the 10 people, 6 are of type O, 2 are of type A, 1 of type B, and 1 of type AB. The people within each group (O, A, B, AB) are the same (non-distinct). So, 10! / (6! · 2! · 1! · 1!).
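
The counting argument above can be sketched in Python — a quick check of both the AABC example and the 10-person blood-type count (integer division keeps everything exact):

```python
from math import factorial

# Orderings of "AABC": 4! total, divided by 2! for the two identical A's
aabc_orderings = factorial(4) // factorial(2)  # 12

# Orderings of 10 people with blood-type counts O=6, A=2, B=1, AB=1
blood_orderings = factorial(10) // (
    factorial(6) * factorial(2) * factorial(1) * factorial(1)
)  # 2520
```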

  • @scarfacek
    @scarfacek 8 years ago +11

    You are awesome. You even showed without replacement which shows you are very thorough in your explanations.

  • @fbrown18
    @fbrown18 10 years ago +18

    Hands down the best explanation on this concept period. The example was so clear and easy to follow. Where's the "Donate" button. I'll gladly chip in for your efforts.

  • @derrickhuang4579
    @derrickhuang4579 3 months ago

    This 10 min video alone explained a whole week of lecture content 10x better than my professor did.

  • @jbstatistics
    @jbstatistics  11 years ago +1

    You are very welcome, and thanks for the compliment! I'm very glad to be of help.

  • @jbstatistics
    @jbstatistics  11 years ago +3

    That's correct. There's nothing wrong with analyzing a situation that is truly multinomial as a binomial, if one is interested in only one of the outcomes. For example, we might be interested in the distribution of the number of people with O- blood in a random sample of 100 people. This has a binomial distribution with 2 possible outcomes on each trial (O-, not O-), even though there are many other blood types. There's nothing wrong with doing this, if that's the question of interest.
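
The point in this reply — each single outcome of a multinomial, taken alone, is marginally binomial — can be checked numerically. A small sketch with made-up numbers (n = 5 trials and probabilities 0.5, 0.3, 0.2 are illustrative assumptions, not from the video):

```python
from math import comb, factorial

n = 5
p = [0.5, 0.3, 0.2]  # hypothetical probabilities for three outcomes

def multinom_pmf(xs, ps, n):
    """Multinomial pmf: n!/(x1!...xk!) * p1^x1 * ... * pk^xk."""
    coef = factorial(n)
    for x in xs:
        coef //= factorial(x)
    prob = float(coef)
    for x, q in zip(xs, ps):
        prob *= q ** x
    return prob

# P(X1 = 2) computed as a binomial (outcome 1 vs. everything else)...
binom = comb(n, 2) * p[0] ** 2 * (1 - p[0]) ** (n - 2)
# ...equals the multinomial pmf summed over all splits of the other 3 trials
marginal = sum(multinom_pmf([2, x2, 3 - x2], p, n) for x2 in range(4))
```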

  • @jbstatistics
    @jbstatistics  11 years ago +4

    Yes, X_1 through X_k are random variables, and collectively can be thought of as a k-dimensional random vector.
    Not all random vectors have a multinomial distribution of course, and we'd only use the multinomial distribution to find probabilities if the conditions of the multinomial distribution are met.

  • @p4n0z
    @p4n0z 11 years ago +2

    that's exactly what I was thinking!
    Excellent simple intros!
    Thank you so much JB!

  • @SalsaTiger83
    @SalsaTiger83 10 years ago +28

    Very nice voice, very good explanation, clear presentation. Love it!

    • @jbstatistics
      @jbstatistics  10 years ago +4

      Thanks for all the compliments! I'm very glad to hear that you love my video! All the best.

    • @dwivedys
      @dwivedys 7 years ago

      jbstatistics do you have a lesson on gamma distribution?

  • @ZbiggySmall
    @ZbiggySmall 5 years ago +5

    I enjoyed every single video in this playlist. Your approach really helped me understand what random variables are. It is also very explicit in your videos what the possible values of an rv are, given a distribution. It looks so trivial now but was very confusing before. Many thanks for that.

  • @jeronimocid51
    @jeronimocid51 4 years ago +1

    Thank you! People are still using this! Please keep going! :D

  • @jadrianism
    @jadrianism 7 years ago +2

    I passed the quiz, that is why I am here to thank you, and it helped

  • @carmineiuorio2638
    @carmineiuorio2638 5 years ago +1

    Excellent video... I found it incredibly straightforward, clear and complete. You're a great teacher!

  • @TheJProducti0ns
    @TheJProducti0ns 3 years ago

    Short and Straight to the point. Thank you good sir.

  • @kunalmohan2689
    @kunalmohan2689 10 years ago +1

    Thanks. All your lectures are great. Spent the whole day going through most.

    • @jbstatistics
      @jbstatistics  10 years ago +1

      You are welcome kunal. I'm very glad you find my videos helpful.

  • @rushi7627
    @rushi7627 2 years ago +1

    Everything about your teaching is classic... thanks ❣️ Love from India ❣️

  • @ThESpOt112
    @ThESpOt112 9 years ago +4

    Thank you so much for all of your videos, I really appreciate it!!!! They have helped me so much, I wish you were my instructor!

  • @annali9577
    @annali9577 3 years ago

    Am I the only one who's watching this in 2021? It must be hard to learn without you. Appreciate it!

  • @kcorr94
    @kcorr94 11 years ago

    You are a god among shadows; you teach in under 30 minutes what my teachers fail to do in two months

  • @whostalha_
    @whostalha_ 2 years ago +1

    Watching this video in 2022 and it's still amazingly effective

  • @jbstatistics
    @jbstatistics  11 years ago +1

    I do very much appreciate the compliment, but trust me on this, the American education system produces thousands upon thousands of people far more eloquent than I could ever hope to be. I am sure there are many wonderful educators in every country, just as there are some problems with the education system in every country. My roots growing up in a small Northern Ontario town slip through sometimes -- listen carefully for the "we're gunna..." that shows up in some of my videos :)

  • @reddmst
    @reddmst 3 months ago

    Good explanation of the basics. Wish you did a sequel with more elaborate examples :) The problem I'm facing now could be reduced to "given a k-sided die, what is the probability of getting each possible value at least once within n throws" and there's no video on youtube that would help tackle this.
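
For what it's worth, the die problem in this comment does have a closed form by inclusion-exclusion over which faces are missing. A sketch (the function name is mine):

```python
from math import comb

def p_all_faces(k, n):
    """P(each of k equally likely faces appears at least once in n throws),
    by inclusion-exclusion over the set of missing faces."""
    return sum((-1) ** j * comb(k, j) * ((k - j) / k) ** n
               for j in range(k + 1))

# Sanity checks: two coin flips showing both sides has probability 1/2;
# six throws of a fair die showing all six values has probability 6!/6^6.
```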

  • @yaldafazlalizadeh2307
    @yaldafazlalizadeh2307 2 years ago

    Thanks for the great explanation of the multinomial distribution

  • @RavinduW
    @RavinduW 1 year ago

    It is very clear to understand. Thank you for your video.🥰

  • @lenishpandey192
    @lenishpandey192 9 months ago

    Yo, I am going to follow all of your videos on statistics along with the stats book I am currently reading. Thank you, really really helpful.

  • @waka853
    @waka853 5 years ago +1

    Thank you for your nice explanation. From Japan.
    Thank you for the easy-to-understand video!

  • @yokmancing2908
    @yokmancing2908 11 years ago +1

    Great explanation. From UPM, Malaysia. Thanks a lot.

  • @jbstatistics
    @jbstatistics  12 years ago

    You are welcome!

  • @yougos1178bc
    @yougos1178bc 7 years ago +1

    Great work man. I use your videos as an introduction before studying, helps a bunch ! Thank you.

    • @jbstatistics
      @jbstatistics  7 years ago +1

      You are very welcome! I'm glad to be of help!

  • @sandorszabo2470
    @sandorszabo2470 4 years ago

    Clear explanation. Very useful.

  • @g7_michaelafiajurilla742
    @g7_michaelafiajurilla742 3 years ago +1

    Still here in 2021! (Need this for online classes) Thank you very much!

    • @jbstatistics
      @jbstatistics  3 years ago +1

      I'm still here too! You are very welcome.

  • @thisnamewastaken413
    @thisnamewastaken413 3 years ago

    Very helpful, thank you! You have a good voice for teaching :)

  • @xyzant5069
    @xyzant5069 4 years ago +1

    I like your voice very much; it helps draw my attention

  • @jbstatistics
    @jbstatistics  11 years ago

    Thanks! I get a lot of views from Malaysia. You folks must know good videos when you see them :) Cheers from Canada.

  • @liaoshun944
    @liaoshun944 9 years ago

    Pretty clear explanation ! Very helpful

  • @vasylarsenii4800
    @vasylarsenii4800 3 years ago

    Very clear explanation!

  • @jalepezo
    @jalepezo 1 year ago

    In R use the dmultinom function for your observations and the vector of probabilities
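
For readers without R, here is a minimal pure-Python analogue of that dmultinom call (the helper name mirrors R's), applied to the video's blood-type example: counts (6, 2, 1, 1) with probabilities (0.44, 0.42, 0.10, 0.04):

```python
from math import factorial

def dmultinom(x, prob):
    """Pure-Python analogue of R's dmultinom: the multinomial pmf."""
    coef = factorial(sum(x))
    for xi in x:
        coef //= factorial(xi)
    out = float(coef)
    for xi, pi in zip(x, prob):
        out *= pi ** xi
    return out

p = dmultinom([6, 2, 1, 1], [0.44, 0.42, 0.10, 0.04])  # ≈ 0.0129
```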

  • @cristinalaodenio6178
    @cristinalaodenio6178 1 month ago

    Can you please explain how you get the answer in the without-replacement part? I don't get it

  • @takudzwamarowa5941
    @takudzwamarowa5941 3 years ago

    Well explained.... Thank you

  • @riyazahmed7907
    @riyazahmed7907 4 years ago

    At 8:23, how come the denominators of 20 don't change, given the "with replacement" nature of this question?

    • @riyazahmed7907
      @riyazahmed7907 4 years ago

      Oh sorry, never mind, I got confused between "with replacement" and "without replacement".

  • @tpthpt5973
    @tpthpt5973 1 month ago

    Amazing video!

  • @Terrariagrossmeister
    @Terrariagrossmeister 1 month ago

    Is there an efficient way to calculate something akin to:
    An urn contains 8 red balls, 3 yellow balls, and 9 white balls. 6 Balls are randomly selected **with replacement**.
    What is the probability that at least 2 are red and at least 1 is yellow?

    • @jbstatistics
      @jbstatistics  1 month ago

      It depends what you mean by "efficient" :) There's no quick formula or anything like that; you have to add the probabilities of all the cases that satisfy the condition. So it can get a little complicated and messy.

    • @Terrariagrossmeister
      @Terrariagrossmeister 1 month ago

      @jbstatistics Ahhh sad...
      I have a similar case I wanted to calculate but that has 8 relevant outcomes within hundreds of thousands of trials...
      So adding all the cases up that satisfy the condition doesn't seem to be an option
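
The case-by-case sum in this thread is messy by hand but trivial by machine: enumerate every (red, yellow, white) split of the 6 draws and add the multinomial probabilities of the splits that qualify. A sketch for the urn in the question above (8 red, 3 yellow, 9 white, with replacement, so p = 0.40, 0.15, 0.45):

```python
from math import factorial

n = 6
p_red, p_yel, p_whi = 8 / 20, 3 / 20, 9 / 20

def pmf(r, y, w):
    """Multinomial probability of r red, y yellow, w white in n draws."""
    coef = factorial(n) // (factorial(r) * factorial(y) * factorial(w))
    return coef * p_red**r * p_yel**y * p_whi**w

# Sanity check: the probabilities over all splits sum to 1
total = sum(pmf(r, y, n - r - y)
            for r in range(n + 1) for y in range(n - r + 1))

# P(at least 2 red AND at least 1 yellow): sum only qualifying splits
answer = sum(pmf(r, y, n - r - y)
             for r in range(2, n + 1) for y in range(1, n - r + 1))
# answer ≈ 0.4422
```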

  • @chaudharikavan8423
    @chaudharikavan8423 4 years ago

    very good explanation

  • @ozshabat
    @ozshabat 10 years ago

    Awesome!
    Very clear!
    You should do your own talk show

  • @crepeaunutellacrepeaunutel1811
    @crepeaunutellacrepeaunutel1811 9 years ago +1

    Hello. What's the difference between the multinomial distribution without replacement and the hypergeometric? They seem identical.

  • @azisagantal320
    @azisagantal320 4 years ago

    thank you for your hard work sir

  • @vincec2426
    @vincec2426 10 years ago

    Thank you. This is an excellently produced tutorial.

  • @jbstatistics
    @jbstatistics  11 years ago

    Thanks! I'm glad to be of help.

  • @sekhar018
    @sekhar018 2 years ago

    Good job. It's great 👍

  • @samtech2003
    @samtech2003 2 years ago

    How can we solve the without-replacement calculation (0.18204)?
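
A sketch of that without-replacement number, reconstructing the example from the surrounding comments (an urn with 8 red, 3 yellow, and 9 white balls; drawing 6 and wanting 2 red, 1 yellow, 3 white — these counts are my inference, not stated here). Each group is chosen separately and divided by the total number of 6-of-20 draws, i.e. a multivariate hypergeometric probability:

```python
from math import comb

# 2 of 8 red, 1 of 3 yellow, 3 of 9 white, out of all 6-of-20 draws
p = comb(8, 2) * comb(3, 1) * comb(9, 3) / comb(20, 6)  # ≈ 0.18204
```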

  • @faracanthaceae4946
    @faracanthaceae4946 8 years ago

    10:40
    is multinomial distribution without replacement similar to hypergeometric distribution?

  • @michaelliu6323
    @michaelliu6323 2 years ago

    Thank you so much! This video is very helpful!

  • @rajeshwarsingha8211
    @rajeshwarsingha8211 5 years ago

    Outstanding Sir!!

  • @perceptrongaming4290
    @perceptrongaming4290 5 years ago

    Have you done a video on the exponential distribution?

  • @datascience-vf9kx
    @datascience-vf9kx 3 years ago

    It was very useful. Can I share this video in my Instagram posts?

  • @domonkos02
    @domonkos02 11 years ago

    Hey, I was wondering what possible problems it can cause if a multinomial variable is recoded into a binomial and analyzed as such. I think the only difference is the research question one answers: if kept in multinomial form, you can find out the probability that an individual with certain characteristics chooses option A, B, or C. If the variable is recoded, then you can only find out the probability that this person prefers option A over all non-A options. Is that correct?

  • @wesleysbr
    @wesleysbr 8 years ago

    Why haven't you raised the (8 choose 2) to the power of 2, at 10:28?

    • @jbstatistics
      @jbstatistics  8 years ago

      For that part of the calculation, we want the number of ways of picking 2 red balls from 8, which is what (8 choose 2) represents. The square of (8 choose 2) is not a useful number for us in that question.

  • @sau002
    @sau002 4 years ago

    Nicely explained

  • @aakashkatiyar5005
    @aakashkatiyar5005 3 years ago

    Liked, subscribed and shared!

  • @mando1964
    @mando1964 1 year ago

    Great explanation. But what I can't wrap my head around is why we need the "different ways to arrange them". Let's say we have the blood example, and we have a sample of 5 people, and 3 of them are O, 2 are A. How is OOOAA not the same as AAOOO, if nowhere in the problem does it say there is a specific order to take into consideration when taking the sample?

    • @jbstatistics
      @jbstatistics  1 year ago +2

      It is the same as far as we are concerned, and we don't care about the order. We simply want to know the probability of a certain number of occurrences of each of the k outcomes. But that's precisely why we need that multinomial coefficient in front. p_1^x_1...p_k^x_k is the probability of getting *any specific ordering* of x_1 occurrences of outcome 1, x_2 occurrences of outcome 2, etc. That's what that term gives us. We don't care about that in isolation though, so we need to add up the probabilities of all the orderings that get us x_1 occurrences of outcome 1, x_2 occurrences of outcome 2, etc. That's what multiplying by the multinomial coefficient does for us.
      It's the same logic as for the binomial distribution. The binomial is a slightly simpler situation, and I go into this notion in a little more detail in my binomial video. (If you don't understand what I'm talking about above, you might find the binomial video helpful.)

    • @mando1964
      @mando1964 1 year ago

      @@jbstatistics thank you! So my understanding is that for either the binomial or multinomial distributions, it is implied that the events are in succession. Like, in the blood example, we are asking one person, then another, then another, and that's why the orders OOOAA and AAOOO are considered different events?

    • @jbstatistics
      @jbstatistics  1 year ago +1

      @@mando1964 It's just a way to visualize it. We could ask them all at exactly the same time, and have them all respond at exactly the same time, and that would not change the situation.
      Toss 2 fair coins at the same time. What's the probability heads comes up exactly once? 1/2, and there's a variety of ways to come up with that, one of them being:
      In a conceptual world, where a magic fairy were to glance at the coins, there are 4 outcomes: TT, TH, HT, HH, where the first letter represents the outcome on the coin that the fairy happened to see first. Each of those 4 outcomes has a probability of (1/2)(1/2) = 1/4. But we care not about the order, so TH or HT would get us what we need in our world. Thus 2 of those outcomes in that magical fairy world give us heads exactly once, with each of those outcomes having a probability of 1/4, and thus the probability of getting heads exactly once is 2(1/4) = 1/2.

    • @mando1964
      @mando1964 1 year ago

      @@jbstatistics This last paragraph made it clear to me. It reiterated that it all comes down to the many ways HT can occur. Hence why it has a bigger probability than HH or TT. Like when rolling 2 dice, the probability of getting a 7 is way higher, because there are more ways it can occur, correct?

    • @jbstatistics
      @jbstatistics  1 year ago +1

      @@mando1964 Yes, it's related to that idea. We only care about getting the 7, but there are many ways of getting 7, so we need to add up the probabilities of all those different ways.

  • @mandrinnd10
    @mandrinnd10 10 years ago

    This has helped me memorize concepts for Exam P. Do you have a video on convolution? I can't find one.

  • @rimjhimmishra6337
    @rimjhimmishra6337 8 months ago +2

    Thank you for this!

  • @ayeshaadhikari6123
    @ayeshaadhikari6123 4 years ago

    thank you so much! it's really helpful ❤️

  • @Dupamine
    @Dupamine 9 months ago +1

    I think it should be random variable Xk instead of Xi because k represents outcomes

    • @jbstatistics
      @jbstatistics  9 months ago +1

      There are k random variables: X_i, for i = 1, ..., k. X_k is the kth random variable.

  • @arpithams
    @arpithams 11 years ago

    Do we use the multinomial distribution with random vectors? As I understood it, a random vector is a collection of random variables? Is this correct?

  • @KeremBJK96
    @KeremBJK96 6 years ago

    Best stat channel on YouTube?

    • @jbstatistics
      @jbstatistics  6 years ago

      Absolutely! Why the question mark? :)

  • @markanthonyoccena7345
    @markanthonyoccena7345 3 years ago

    thank you for this, really helpful

  • @cliffordino
    @cliffordino 4 years ago

    Very helpful. Thanks.

  • @ohnoyo
    @ohnoyo 8 years ago

    I'm still confused as to how the probability is calculated at 6:33

    • @uhuihiuhuihi
      @uhuihiuhuihi 8 years ago

      10!/(6!·2!·1!·1!) is by definition the same as 10!/(6×5×4×3×2×1 × 2×1 × 1 × 1), since 6! = 6×5×4×3×2×1, etc. Multiply that by 0.44^6 × 0.42^2 × 0.1 × 0.04 and you will get 0.012902538, which is the answer.

    • @FloppyDobbys
      @FloppyDobbys 7 years ago

      The trials are independent. Remember that if each trial is independent of the others, such as a coin flip, then P(a, b) = P(a) · P(b); that's why you see the right-hand side of the equation. The left-hand side comes straight out of probability theory, where the number of different ways this event can occur is (the number of permutations if we considered every event distinct) / (the number of ways we can shuffle around the similar events, such as the A's).

  • @srdjan780
    @srdjan780 9 years ago

    At 3:36 did you mean up to k, not up to n?

    • @jbstatistics
      @jbstatistics  9 years ago

      +Srx 30 No, I meant n. Each of the individual random variables can take on any whole numbered value between 0 and n, subject to the constraint that they all must sum to n.

  • @AndersSildnes
    @AndersSildnes 11 years ago +3

    Great video

    • @jbstatistics
      @jbstatistics  11 years ago +1

      Thanks Anders! I'm glad to be of help.

  • @jubeidono
    @jubeidono 5 years ago +1

    Great video, thank you for the explanation! I was wondering, wouldn't it be easier to follow if you called the variables X_red, X_yellow and X_white instead of X1, X2 and X3?

  • @zeio-nara
    @zeio-nara 5 years ago

    Thank you a lot, perfectly explained

  • @namhoang353
    @namhoang353 10 years ago

    Dear Mr. jbstatistics! Thanks for your presentation. I have a problem with Multinomial Naive Bayes. I can't fully understand the meaning of one of the factors in the formula for the probability of a document in the Multinomial Naive Bayes model:
    P(d_i|C_j) = P(|d_i|) · |d_i|! · ∏_{t=1}^{|V|} P(W_t|C_j)^{N_it} / N_it!
    (∏ is a product; the comment box doesn't allow the special symbol, so I couldn't express it.)
    My question:
    P(|d_i|) — what does this probability mean? How do I compute it?
    Please explain it to me! Thank you so much.
    Best regards,
    Nam.

  • @name2964
    @name2964 2 years ago +1

    Thanks

  • @hamzamohd.zubair1709
    @hamzamohd.zubair1709 2 years ago

    Can we say that for multinomial distributions all possibilities should be 1. MECE (mutually exclusive, collectively exhaustive) and 2. IID?

  • @raivatshah7847
    @raivatshah7847 7 years ago

    Awesome! Thank you so so much +jbstatistics! You're truly an amazing statistician!

    • @jbstatistics
      @jbstatistics  7 years ago +1

      I'm not so sure about the "amazing statistician" part, but I do my best and I'm glad to be of help!

  • @pedrofernandosalgadoalvare772
    @pedrofernandosalgadoalvare772 4 years ago

    Thank you so much!!!!! you're wonderful!

  • @muneeburrehman3624
    @muneeburrehman3624 8 years ago

    good explanation

  • @frederikandresen7378
    @frederikandresen7378 7 years ago

    Lifesaving videos

  • @jfru5trate11
    @jfru5trate11 11 years ago

    awesome videos, keep them going :DD

  • @seungkwonbeack3828
    @seungkwonbeack3828 6 years ago

    Really Great!

  • @Danieldrd
    @Danieldrd 3 years ago

    Thank you so so much.

  • @deepnme1
    @deepnme1 7 years ago

    n! / (x_1! ... x_k!), the number of orderings that give us x_1 occurrences of outcome 1 and so on. It is nCx, right? So where is (n-x)! in the denominator? Thank you

    • @jbstatistics
      @jbstatistics  7 years ago

      It is a generalization of nCx beyond just two groups of items (successes and failures) to k groups of items (group 1, group 2, ..., group k). With just two groups, it reduces to n!/(x!(n-x)!)
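
That reduction is easy to check numerically, e.g. for n = 10, x = 4:

```python
from math import comb, factorial

n, x = 10, 4
# With k = 2 groups (x successes, n - x failures), the multinomial
# coefficient n!/(x1! * x2!) is exactly "n choose x":
multi_coef = factorial(n) // (factorial(x) * factorial(n - x))
```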

  • @MintyParasol
    @MintyParasol 4 years ago

    is this for grade 9?

  • @miyamotomusashi4556
    @miyamotomusashi4556 1 year ago

    Thank you ❤

  • @m7mdarwani964
    @m7mdarwani964 3 years ago +1

    Salute.

  • @rushi_fit
    @rushi_fit 5 years ago

    Thank you sir !

  • @MH-zv2tc
    @MH-zv2tc 2 years ago +1

    🔥🔥🔥🔥😍🤩🤩

  • @vinayak186f3
    @vinayak186f3 3 years ago

    Thanks 😀

  • @rajershigpt
    @rajershigpt 5 years ago

    nice, very nice

  • @mamado226
    @mamado226 5 years ago

    @jbstatistics magnificent work, arguably the best explanation of probability distributions on YouTube, an evergreen tutorial! Many thanks, sir.

  • @kurtissac
    @kurtissac 3 years ago

    thanks