How does Netflix recommend movies? Matrix Factorization

  • Published 29 Nov 2024

COMMENTS • 332

  • @Xraid32
    @Xraid32 6 years ago +301

    Sharknado = Twister + Jaws. This was gold. That was the moment all of Machine Learning made sense.

    • @ianboard544
      @ianboard544 4 years ago +8

      It sounds like a pitch you might make to a production executive.

    • @AmeyPanchpor02
      @AmeyPanchpor02 4 years ago +3

      Very true

  • @computerguycj1
    @computerguycj1 5 years ago +89

    Sir, I've seen almost all of these concepts painfully "explained" in many different ways, but never have I seen them presented as elegantly and intuitively! Excellent video!

  • @conintava514
    @conintava514 6 years ago +127

    So informative and easy to follow. I love this. Thank you so much for taking the time to create this video. It's so important to know how the concepts we learn in class can be applied in real life. This has changed everything for me. Thank you again.

  • @shivanshkaushik383
    @shivanshkaushik383 2 years ago +5

    This is a work of art. Never thought matrix factorization could be explained so effortlessly yet so clearly. You have helped me a lot with this sir! Thank You, God bless you!

  • @anushkagupta79
    @anushkagupta79 3 years ago +6

    I read so many articles about this topic but was never able to understand it. You made it all so easy. Excellent work!!!

  • @reyhanehhashempour8522
    @reyhanehhashempour8522 6 years ago +7

    Luis! You are a fantastic teacher! Not everyone can explain complicated concepts in a way that everybody understands. Your teaching style shows the depth of your knowledge! Thank you!

  • @AG-dt7we
    @AG-dt7we 26 days ago

    I’ve heard the saying, 'You don’t really understand something unless you can explain it to your grandmother.' Watching this, I can totally relate to it. Amazing explanation!

  • @atulitraopervatneni9320
    @atulitraopervatneni9320 6 years ago +24

    You are one of the best teachers on YouTube. Thanks

  • @killuawang677
    @killuawang677 4 years ago +1

    This video deserves 10x more likes. I've got to say, it is so much better than Google's own recommendation system crash course...

  • @ws-ob4wy
    @ws-ob4wy 6 years ago +9

    I find your teaching method not only to be great but also very valuable to motivate young people to take up Machine Learning. You could make it even better by also relating it to the math (Linear Algebra, Calculus, Probability) in a more familiar form. Make sure that anyone teaching and learning ML in a college environment will be aware of your videos. Great stuff.

  • @TheGenerationGapPodcast
    @TheGenerationGapPodcast 3 years ago

    Most of us who have been watching your videos are changed forever. We are convinced now that there are better ways to teach machine learning and your way is one of the better ways. Thanks

  • @crazywebhacker9769
    @crazywebhacker9769 4 years ago

    There are not many really good ML videos on YT. This is by far one of the best.

  • @ruchitchudasama1407
    @ruchitchudasama1407 3 years ago

    This video blew my mind. I never imagined that the matrix multiplication that we learnt in high school could find such a huge application.

  • @glencheckisthename
    @glencheckisthename 1 year ago

    I searched about 20 videos and blogs, and this is the best explanation of FM

  • @kevdaag2523
    @kevdaag2523 6 years ago +9

    That was great, the way you explained matrix factorization and then turned it into an explanation of ML.

  • @penguinmonk7661
    @penguinmonk7661 1 year ago

    If you are reading this, you perhaps found the missing link in your ML knowledge. I sure as heck know I have, so don't pass up on it. Watch at least the first 15 min.
    Praise:
    I am a CS academic chair with a specialization in security and distributed systems.
    Never have my peers in machine learning/AI truly explained to me why this works. I just knew it had to do with mathematics, and I knew how to use the software modules. I could implement them line by line and turn math found on Wikipedia or in textbooks into code. I have even been part of a research effort into self-learning robots using HyperNEAT.
    From the bottom of my heart: THANK YOU.
    I finally understand how this works. It's been 3 weeks now and this entire field has opened up for me; I look at it with such different eyes and so much more appreciation and wonder.
    THANK YOU.

  • @AmeyPanchpor02
    @AmeyPanchpor02 4 years ago +4

    Really, this is one of the best introductory videos I have found. Knowledge + simple examples = a very good understanding of the topic.

  • @Vatn7
    @Vatn7 9 months ago

    By far, one of the best videos on Matrix Factorization! I was looking for a good explanation on this and instantly clicked on this video as soon as I saw it was from Luis. Luis, you are a fantastic teacher!

  • @SajjadZangiabadi
    @SajjadZangiabadi 1 year ago +1

    The instructor does an excellent job of breaking down concepts and explaining them step by step in a way that is easy to understand. I appreciate the time and effort put into creating such an informative and well-presented video. Thank you for sharing your knowledge with us.

  • @fernandobezerra4040
    @fernandobezerra4040 4 years ago +1

    THE BEST VIDEO ABOUT MATRIX FACTORIZATION EVER! CONGRATULATIONS, TEACHER!

  • @hcordioli
    @hcordioli 3 years ago

    Luis, this is the best explanation I've ever seen, not only of recommendation systems, but of basic concepts like gradient descent, loss functions, matrix factorization, etc. Congratulations on your didactic style, and thanks for sharing!

  • @jonlenescastro1662
    @jonlenescastro1662 3 years ago

    Definitely the best explanation on YT

  • @andis9076
    @andis9076 1 year ago

    Man, YOU'RE GOOD! I rarely see a video that explains things as clearly as yours does!

  • @samarthpianoposts8903
    @samarthpianoposts8903 2 years ago

    Since the video is over 30 min long, let me break it up
    00:40 How do recommendations work (Netflix example)
    07:35 How to figure out dependencies (Matrix Factorization)
    16:03 Matrix Factorization Benefits
    20:38 How to find the right factorization
    26:35 Error Function for factorization
    30:14 How to use the factors to predict ratings (Inference)
    Really informative and comprehensible. I was wondering what is the difference between collaborative filtering and the Deep Learning recommendation algorithms. Now I understand that DL is one of the ways to perform the factorization for the collaborative filtering method.

    • @SerranoAcademy
      @SerranoAcademy  2 years ago +1

      Thank you, that's very helpful! Added the timings to the video.
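
The pipeline the timestamps above outline (factorization, error function, gradient descent, inference) can be sketched end to end in a few lines. This is a minimal illustration with made-up ratings, not the video's actual code:

```python
import numpy as np

def matrix_factorize(R, k=2, lr=0.02, epochs=8000, seed=0):
    """Factor R into U @ V by gradient descent on the squared error,
    using only the known (non-NaN) ratings."""
    rng = np.random.default_rng(seed)
    n_users, n_movies = R.shape
    U = rng.normal(scale=0.1, size=(n_users, k))   # per-user feature rows
    V = rng.normal(scale=0.1, size=(k, n_movies))  # per-movie feature columns
    mask = ~np.isnan(R)
    R_known = np.where(mask, R, 0.0)
    for _ in range(epochs):
        E = (U @ V - R_known) * mask   # error only where a rating exists
        U -= lr * (E @ V.T)            # nudge user features downhill
        V -= lr * (U.T @ E)            # nudge movie features downhill
    return U, V

# Toy ratings table: rows are users, columns are movies, NaN = not rated yet.
R = np.array([[3.0, 1.0, np.nan],
              [1.0, 2.0, 3.0],
              [3.0, 1.0, 1.0]])
U, V = matrix_factorize(R)
pred = U @ V   # known entries are reproduced; the NaN slot is the prediction
```

The entry of `pred` where `R` had a NaN is the model's predicted rating, which is the inference step covered at 30:14.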

  • @premkumarpathare
    @premkumarpathare 1 year ago

    One of the best explanations of matrix factorization. Once you understand it, you can't forget it.

  • @SurajAdhikari
    @SurajAdhikari 4 years ago +2

    Thanks Luis. This is one of the first videos I watched on Matrix Factorization and I understood them really well. Great job. Keep posting.

  • @karannchew2534
    @karannchew2534 1 year ago +1

    Hi Serrano, a suggestion, please.
    Before walking through a detailed example, please first introduce the overall concept/algorithm/intuition and the content/agenda. First tell learners what they can expect to see/learn, then start teaching them. Thanks for all the useful videos!

  • @sumitchhabra2419
    @sumitchhabra2419 3 years ago

    I haven't come across any video on internet with such an intuitive explanation. Loved it!!!

  • @MohitJaggi-f8h
    @MohitJaggi-f8h 2 months ago

    Nicely explained. Small nit: you say you square to avoid ambiguity between positive and negative, which is a misleading simplification. The real reason is to keep the errors from canceling each other out when you add them up over all ratings. That is indeed the step you show next, so it would be easy to add the accurate explanation.

  • @ShimmerCloudz
    @ShimmerCloudz 5 years ago +2

    The tutor gave a clear understanding of matrix factorization. Also, even though this lecture was not about hyperparameters and gradient descent, it was the first time I got a clear understanding of those two concepts.

  • @sarthaktiwari1889
    @sarthaktiwari1889 4 years ago +1

    It is one thing to have a great hold over technical concepts and another thing to be able to explain them. You have both. Very well explained!!!

  • @amanzholdaribay9871
    @amanzholdaribay9871 5 years ago +1

    WooooW! That was as simple as possible! If a person understands something, they can explain it even to a child; the level of understanding here is amazing! Thank you!

  • @mingman753
    @mingman753 3 years ago

    OMG this is wonderful. My mother tongue is not English, but this lecture is much easier to understand than others in my own language. I logged in just to 'like' this video. Thank you so much for your video!

  • @UmeshRajSatyal
    @UmeshRajSatyal 5 years ago +7

    Well explained and very easy to understand. Stopping here to thank you.

  • @shelllu6888
    @shelllu6888 3 years ago

    honestly the best video I've seen to explain matrix factorization. Thank you so much!

  • @vishalmendekar7006
    @vishalmendekar7006 5 years ago

    One of the best video explanations I have found so far. Everything is crystal clear with real examples. Thanks a lot for posting the video

  • @spikeydude114
    @spikeydude114 2 years ago +1

    You really did a great job of distilling what I saw as a complex topic to something practical and understandable. Great video!

  • @Josh-di2ig
    @Josh-di2ig 2 years ago

    By far the best ML teacher ever. Thanks for a great vid!

  • @vinaysingh6664
    @vinaysingh6664 5 years ago +2

    I really like the way you explained this concept in such simple words. What we do at deeper levels builds on what we learn in the basics, and getting those clear is the most important thing; I think you took really good care of that. Again, thank you for this great resource. :)

  • @hasush
    @hasush 3 years ago

    2 hours spent trying to understand matrix factorization and SVD++ on Wikipedia to no avail ... 30 minutes here and it's super clear... awesome, thank you!
    One question: how do you deal with new users, movies, and ratings? Retraining? What other solutions avoid retraining?
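
One common answer to the new-user question (an assumed approach, not covered in the video) is "fold-in": keep the learned movie-feature matrix fixed and fit only the new user's feature vector by least squares on the few ratings they have given, with no full retraining. A sketch with hypothetical learned features:

```python
import numpy as np

# Hypothetical movie-feature matrix V learned earlier (k = 2 latent features,
# 3 movies). In practice this comes out of the trained factorization.
V = np.array([[1.0, 0.2, 0.9],
              [0.1, 1.0, 0.3]])

rated = [0, 1]                    # the new user has rated movies 0 and 1...
ratings = np.array([4.0, 1.0])    # ...with these scores

# Solve min_u || V[:, rated].T @ u - ratings ||^2 for the user's vector u.
u, *_ = np.linalg.lstsq(V[:, rated].T, ratings, rcond=None)
scores = u @ V                    # predicted scores for every movie
```

The same trick works symmetrically for a brand-new movie with a few ratings, holding the user factors fixed.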

  • @in-my-opinion6423
    @in-my-opinion6423 2 years ago

    Awesome. Nowhere else would one see such a clear explanation

  • @azurewang
    @azurewang 2 years ago

    Watched again after 3 years, still amazed!

  • @gobbledee55
    @gobbledee55 9 months ago

    Wow... you did an outstanding job of explaining this topic. Thank you for this. It was very clear and concise, and the graphics were spot on and helped visualize everything. Visual learners everywhere are thankful for this presentation :D

  • @nnslife
    @nnslife 4 years ago +1

    Great video!
    I rarely assign this kind of title to a video, but this was really great: easy and detailed at the same time!
    Once you put matrices like at 13:23, I was like: wow, this is how matrix multiplication should be introduced in colleges!
    Even many years later and with a good understanding of linear algebra, this adds so much intuition.

  • @ultraviolenc3
    @ultraviolenc3 4 years ago +1

    Great video! So much easier now to comprehend more complicated material after your explanation

  • @dayan5402
    @dayan5402 4 years ago +2

    Real-life application + theory in simple terms. Very nice! Thank you!

  • @jackshi7613
    @jackshi7613 2 years ago +1

    Well explained concepts, really appreciate your nice video

  • @killuawang677
    @killuawang677 4 years ago +2

    Actually, I do still have a question after this video: how do we know how many features we are supposed to have? I.e., how were you able to decide the factorized matrices are 2 x N and M x 2? Does it mean you might end up getting a feature that is a combination of multiple "actual features" and you need to further break it down?

    • @samarthpianoposts8903
      @samarthpianoposts8903 2 years ago +1

      That is something people experiment with by seeing what gives them the best result. In general, people try values proportional to the logarithm of the number of unique items.
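
The reply's point can be made concrete: treat the number of latent features k as a hyperparameter, hide some known ratings, and keep the k with the lowest held-out error. A sketch on synthetic rank-3 data (all numbers here are illustrative, not from the video):

```python
import numpy as np

def factorize(R, k, lr=0.01, epochs=6000, seed=0):
    """Gradient-descent matrix factorization; NaN marks a missing rating."""
    rng = np.random.default_rng(seed)
    U = rng.normal(scale=0.1, size=(R.shape[0], k))
    V = rng.normal(scale=0.1, size=(k, R.shape[1]))
    mask = ~np.isnan(R)
    R_known = np.where(mask, R, 0.0)
    for _ in range(epochs):
        E = (U @ V - R_known) * mask
        U -= lr * (E @ V.T)
        V -= lr * (U.T @ E)
    return U, V

rng = np.random.default_rng(1)
R_full = rng.normal(size=(20, 3)) @ rng.normal(size=(3, 15))  # true rank is 3
holdout = rng.random(R_full.shape) < 0.2      # hide 20% of the ratings
R_train = np.where(holdout, np.nan, R_full)

errors = {}
for k in (1, 2, 3, 5):
    U, V = factorize(R_train, k)
    # RMSE on the hidden entries only: how well does each k generalize?
    errors[k] = np.sqrt(np.mean((U @ V - R_full)[holdout] ** 2))
best_k = min(errors, key=errors.get)
```

Too small a k underfits (it can't represent the real structure); on noisy real data, too large a k starts memorizing individual ratings instead.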

  • @nguyenkimtrang9525
    @nguyenkimtrang9525 3 years ago

    30 minutes of gold ~ the best explanation ever! Respect! Many thanks to you!

  • @ZavierBanerjea
    @ZavierBanerjea 4 months ago

    As always, a big fan of Luis! He is a master of the "explain this concept to a kid" idea. Of course, that is what greatness is!

  • @黃煜棋-f3h
    @黃煜棋-f3h 4 years ago

    It is so easy to understand such a difficult concept. You must be a great teacher. I like this kind of video very very much. Thanks a lot.

  • @nikhilbelure
    @nikhilbelure 5 years ago +2

    Elegantly explained. I like the description: very friendly introduction. I was struggling to see how matrix factorization plays a role in recommendation systems; now I get it.
    Thanks

  • @phaniramsayapanen5890
    @phaniramsayapanen5890 3 years ago +2

    Great explanation; you seem to understand the concept very clearly. Subscribed immediately! Any videos on expectation maximization, SVD, or dimensionality reduction? Or resources that you liked most?

  • @mohitaggarwal6220
    @mohitaggarwal6220 1 year ago

    The explanation of gradient descent was great, but I'm a little confused about the part around 25:00. In the matrix, the (1,1) element is 1.44, but the actual value is 3, so we need to increase something. It could be [f1][m1], [f2][m1], [A][f1], or [B][f2]. How do we decide which one to increase? And by increasing which value, and by what factor, can we get accurate results? Increasing a single value or multiple values could potentially bring us closer to the answer. If anyone has an answer to this doubt, please clarify. I'm curious to know.
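
On the "which value do we increase?" question: gradient descent doesn't choose one entry. It nudges every contributing entry at once, each in proportion to its partial derivative of the squared error. A minimal sketch (the 1.44-vs-3 numbers echo the video's example; the vectors themselves are made up):

```python
import numpy as np

def sgd_step(u, v, r, lr=0.05):
    """One gradient step on e = (u.v - r)^2 for a single known rating r."""
    err = u @ v - r
    u_new = u - lr * 2 * err * v   # de/du = 2 * err * v  (every component)
    v_new = v - lr * 2 * err * u   # de/dv = 2 * err * u  (every component)
    return u_new, v_new

u = np.array([1.2, 0.0])   # hypothetical user feature vector
v = np.array([1.2, 0.0])   # hypothetical movie feature vector
r = 3.0                    # actual rating; the prediction u.v starts at 1.44

for _ in range(200):
    u, v = sgd_step(u, v, r)
# u @ v now approximates 3: both vectors moved together, no choice was needed
```

Because the error was negative (prediction below the rating), every partial derivative pushed its entry in whatever direction raises the product; entries that contribute more get nudged more.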

  • @youngzproduction7498
    @youngzproduction7498 3 years ago

    I must say thanks for your effort. This vid literally saves my day.

  • @guanyanlin1933
    @guanyanlin1933 5 years ago

    No doubt, it is definitely an excellent tutorial, and it gives a reasonable answer to why the weights in the hidden layer are the embedding of a movie or a person. Thanks a lot.

  • @chandanroy1789
    @chandanroy1789 3 years ago +1

    Great explanation! I was looking for something cool and simple to refresh my past learnings.

  • @shamim-io
    @shamim-io 5 years ago +2

    Sir, you are truly a great teacher. Such a beautiful presentation. You made the concept so simple. Very much grateful to you. Please keep making videos. Love from India!!

  • @scherwinn
    @scherwinn 5 years ago

    Excellent way to show Gradient Descent and error function.

  • @derekhe6816
    @derekhe6816 4 years ago

    Thank you so much for this video. Your teaching style is great, and you presented all the information comprehensively yet simply, so I feel like I have a much greater grasp of the concepts.
    Here are 2 suggestions: perhaps invest in a microphone that gives you clearer sound, because currently I have to turn my headset way up and the peaks get loud enough that it can kind of hurt. Also, if you could spend more time at the end writing out general formulae of the algorithm, like Andrew Ng does, that would be nice.
    Once again, thank you so much for this video!

  • @andykim1614
    @andykim1614 3 years ago +1

    Not even going to lie, very hard NOT to follow. Thank you for the explanations!

  • @mkamp
    @mkamp 6 years ago +2

    Absolutely wonderful. Thanks for taking the time to slow walk us through it.

  • @kamalamarepalli1165
    @kamalamarepalli1165 4 years ago

    What a visual treat to understand the logic and concept behind it... so good and very well explained.

  • @ASHISHDHIMAN1610
    @ASHISHDHIMAN1610 4 years ago

    Hey, sorry for nit-picking, but at 17:01 the red triangle would have a transposed shape, i.e., greater height (2000 users) than width (1000 movies)!! Great video though!! Please make one on Gaussian mixture models.

  • @krishnaKumar-zi6ct
    @krishnaKumar-zi6ct 4 years ago

    Superb presentation! You have simplified and explained the concept so well... clear flow, great visuals. Thank you very much, Luis!!

  • @rigoluna1491
    @rigoluna1491 4 years ago

    By far the easiest thing to follow, thanks

  • @nikhithasagarreddy
    @nikhithasagarreddy 4 years ago +1

    Super, sir. Every class is very clear, but there are only a few classes available. Please upload every class, sir 😘😘😘

  • @guitar300k
    @guitar300k 2 years ago +1

    In your example we have two latent factors, so how do we know which one should increase and which should decrease to reduce the error? It seems like you have to increase/decrease both of them at the same time

  • @renemartinez3014
    @renemartinez3014 3 years ago

    Excellent video. A little slow-paced, but thanks to that there's little room for doubts or misunderstanding. Great job.

  • @nurkleblurker2482
    @nurkleblurker2482 3 years ago +1

    Great video. But how do you determine a user's preferences for movies in the first place?

  • @sargun.nagpal
    @sargun.nagpal 3 years ago

    Given a new movie M6, how do we assign feature values for the movie?
    A related question: given a new person E, how do we come up with their interest in the different features?

  • @iamdanielkip
    @iamdanielkip 5 years ago +1

    I was driven here after reading a chapter in RGA's book where they mention "collaborative filtering". I was curious and decided to learn more about it. I would like to know, though: what computer language is generally used to achieve this? Thank you for the very simple and fun explanation.

  • @sidagarwal43
    @sidagarwal43 4 years ago

    Very clear and lucid explanation. Thanks

  • @tejaswi1995
    @tejaswi1995 2 years ago

    Wow. Great content. The latent features concept became so clear after watching this!

  • @CDALearningHub
    @CDALearningHub 4 years ago

    Thanks. Nicely explained with visuals to understand matrix factorization.

  • @ChetanRawattunein
    @ChetanRawattunein 4 years ago +1

    Thank you, YouTube recommender, for the video 🤗. I was really looking for something this informative.

  • @codingpineappl3480
    @codingpineappl3480 2 years ago

    Best video you can find about matrix factorization. Thanks a lot

  • @naffiahanger9316
    @naffiahanger9316 4 years ago

    Best explanation of matrix factorization.

  • @mkarthikswamy
    @mkarthikswamy 6 years ago

    Brilliant work Luis, I understand Matrix Factorization well enough now!

  • @osmanovitch7710
    @osmanovitch7710 2 years ago

    Teacher, you are a legend. Thank you so much

  • @TejasPatil-fz6bo
    @TejasPatil-fz6bo 3 years ago

    This video made my day...thanks Prof. Luis

  • @vulkanosaure
    @vulkanosaure 4 years ago

    Thanks so much, it's extremely well explained, better than other things I saw on this topic. The structure of the NN that solves this is clear in my mind now

  • @msnjulabs
    @msnjulabs 1 year ago

    How did we figure out what columns should be in the factorized matrices? Also, how did we figure out how many factors should be in the resulting matrices? And how did we identify those columns, say as comedy and action, for a user?

    • @SerranoAcademy
      @SerranoAcademy  1 year ago +1

      Great question! The model does it automatically in the training process. In this example I created two columns for exposition, but in reality, sometimes they mean something (features), but sometimes they're combinations of features, or things that we can't identify, but that the model picks up.

  • @raphaeldayan
    @raphaeldayan 5 years ago +1

    PERFECT VIDEO! YOU ARE THE BEST! So easy to follow, so clear, thank you

  • @nguyenhiep6639
    @nguyenhiep6639 3 years ago

    Thanks for your full explanation. It helped me a lot in understanding my project

  • @bayesml
    @bayesml 4 years ago +1

    Thank you for the video. I just wonder: do I need a matrix with the numbers all filled in for training, and then test the trained model on some sparse test matrix?

  • @shaikhmosakib604
    @shaikhmosakib604 3 years ago

    What an explanation, dude. Thank you so much

  • @zonghengpu2235
    @zonghengpu2235 5 years ago +1

    shark + tornado = sharknado
    what a great lesson!

  • @sajadkarim
    @sajadkarim 4 years ago

    Many thanks for the video. It was really helpful and the way you explained the concept is outstanding. 101/100!

  • @carlitos5336
    @carlitos5336 3 years ago

    THANK YOU. Best explanation ever.

  • @bradhammond5581
    @bradhammond5581 2 years ago

    Great video, you broke down ML into easy-to-understand terms. Great job!

  • @shahnawazhussain7506
    @shahnawazhussain7506 5 months ago

    WOW. So simply explained. Great video.

  • @peaceandlov
    @peaceandlov 3 years ago +1

    Best video ever. Thanks mate.

  • @mariannelaffoon2598
    @mariannelaffoon2598 2 years ago

    Thanks a lot! It's really useful and interesting.

  • @obesechicken13
    @obesechicken13 4 years ago

    One issue I see right away is that this solution doesn't scale: Netflix, for instance, has way more movies and way more users than in this table, and the number of calculations grows rapidly as those two dimensions increase.

  • @HimanshuRobo
    @HimanshuRobo 5 years ago

    How will you decide the number of features in general? Is there a technique to identify the optimum number of features? Can you suggest some algorithms?

  • @abhishekrupakula1613
    @abhishekrupakula1613 3 years ago

    Thank you so much Luis. Very well explained.

  • @arbenkqiku8880
    @arbenkqiku8880 6 years ago +1

    Hi Luis, great video! What is the difference between matrix factorization and non-negative matrix factorization? Does Netflix use non-negative matrix factorization or is it another method?

  • @c0t556
    @c0t556 5 years ago +2

    Can you get overfitting with matrix factorization?
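
Yes: with enough latent features the factors can simply memorize the known ratings. A standard remedy (an assumption about common practice, not something the video covers) is to add an L2 penalty on the factor entries to the squared-error loss, which shows up as an extra shrinkage term in each gradient step. A sketch with made-up ratings:

```python
import numpy as np

rng = np.random.default_rng(0)
R = np.array([[3.0, 1.0, np.nan],   # toy ratings, NaN = unknown
              [1.0, 2.0, 3.0],
              [3.0, 1.0, 1.0]])
mask = ~np.isnan(R)
R_known = np.where(mask, R, 0.0)

k, lr, lam = 2, 0.02, 0.1           # lam is the regularization strength
U = rng.normal(scale=0.1, size=(3, k))
V = rng.normal(scale=0.1, size=(k, 3))
for _ in range(5000):
    E = (U @ V - R_known) * mask
    # The lam * U and lam * V terms shrink the factors toward zero,
    # trading a little training error for less memorization of single ratings.
    U -= lr * (E @ V.T + lam * U)
    V -= lr * (U.T @ E + lam * V)
```

With `lam = 0` this reduces to the plain gradient descent from the video; larger `lam` keeps the factor entries smaller, so known ratings are fit a bit less exactly and predictions for unknown entries generalize better.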

  • @petersilie2950
    @petersilie2950 3 years ago

    This video is so good for understanding machine learning! Thank you :)