Matt Yedlin
Dancing with AI - A Visual Bridge from Dance to ChatGPT
How does ChatGPT work?
In this video, Dr. Matthew Yedlin (Associate Professor in Electrical & Computer Engineering, UBC) and Armin Saadat (MASc student, Applied Science, UBC) demonstrate how the pattern-matching algorithms of large language models (LLMs), such as ChatGPT, can be visualized through dance.
Views: 312

Videos

2022 03 01 09 40 36 1
Views: 373 · 2 years ago
Time Series Analysis with RNNs: The Model Problem Revisited
Views: 608 · 4 years ago
This video revisits the model problem within the context of RNNs. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Authors: Matthew Yedlin, Delaram Behnami, Department of Computer and Electrical Engineering, University of British Columbia.
Machine Learning: In-depth look at Long Short-Term Memory
Views: 697 · 4 years ago
This video goes into an in-depth look at Long Short-Term Memory. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Authors: Matthew Yedlin, Delaram Behnami, Department of Computer and Electrical Engineering, University of British Columbia.
Introduction to Gated Recurrent Units
Views: 2.8K · 4 years ago
This video introduces Gated Recurrent Units. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Authors: Matthew Yedlin, Delaram Behnami, Department of Computer and Electrical Engineering, University of British Columbia.
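To make the gating in that introduction concrete, here is a minimal single-step GRU sketch in NumPy; the weight names and shapes are illustrative and not taken from the video.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x, h, Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh):
    """One GRU step: x is the current input, h the previous hidden state."""
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1.0 - z) * h + z * h_tilde              # blend old state and candidate
```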
Machine Learning: Introduction to Long Short-Term Memory (LSTM)
Views: 580 · 4 years ago
This video talks about Long Short-Term Memory. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Authors: Matthew Yedlin, Delaram Behnami, Department of Computer and Electrical Engineering, University of British Columbia.
Limitations of the basic RNN
Views: 539 · 4 years ago
This video talks about the limitations of the basic RNN. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Authors: Matthew Yedlin, Delaram Behnami, Department of Computer and Electrical Engineering, University of British Columbia.
RNNs: How do they work?
Views: 475 · 4 years ago
This video talks about how RNNs work. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Authors: Matthew Yedlin, Delaram Behnami, Department of Computer and Electrical Engineering, University of British Columbia.
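As a companion to the RNN videos above, a minimal sketch of a vanilla recurrent step and its unrolling over a sequence (NumPy, names illustrative):

```python
import numpy as np

def rnn_step(x, h, W_xh, W_hh, b_h):
    """One step of a vanilla RNN: mix the current input with the previous state."""
    return np.tanh(W_xh @ x + W_hh @ h + b_h)

def rnn_forward(xs, h0, W_xh, W_hh, b_h):
    """Unroll the same step (same weights) over every element of the sequence."""
    h, states = h0, []
    for x in xs:
        h = rnn_step(x, h, W_xh, W_hh, b_h)
        states.append(h)
    return states
```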
Introduction to RNNs
Views: 479 · 4 years ago
This video introduces RNNs. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Authors: Matthew Yedlin, Delaram Behnami, Department of Computer and Electrical Engineering, University of British Columbia.
Time-Series Analysis with RNNs: A model problem
Views: 307 · 4 years ago
This video talks about Time-Series Analysis with RNNs and introduces a model problem. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Authors: Matthew Yedlin, Delaram Behnami, Department of Computer and Electrical Engineering, University of British Columbia.
Basic Architecture of CNN
Views: 865 · 4 years ago
This video talks about the basic architecture of a CNN. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Source used: codelabs.developers.google.com/codelabs/cloud-tensorflow-mnist/index.html?index=../..index#9 Authors: Matthew Yedlin, Mohammad Jafari, Department of Computer and Electrical Engineering, University of British Columbia.
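The video cites the TensorFlow MNIST codelab; a small Keras model in that spirit might look like the sketch below (layer sizes are illustrative, not necessarily those used in the video or codelab):

```python
import tensorflow as tf

# Convolution -> pooling -> dense read-out: the basic CNN pattern.
model = tf.keras.Sequential([
    tf.keras.layers.Conv2D(12, kernel_size=3, activation="relu",
                           input_shape=(28, 28, 1)),   # learn local filters
    tf.keras.layers.MaxPooling2D(pool_size=2),         # downsample feature maps
    tf.keras.layers.Conv2D(24, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(200, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),   # one probability per digit
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```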
Detailed Architecture Replacement
Views: 259 · 4 years ago
This video talks about Detailed Architecture Replacement. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Authors: Matthew Yedlin, Mohammad Jafari, Department of Computer and Electrical Engineering, University of British Columbia.
Receptive Field: Hubel and Wiesel
Views: 1.4K · 4 years ago
This video talks about the receptive field work of Hubel and Wiesel. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Source used: codelabs.developers.google.com/codelabs/cloud-tensorflow-mnist/index.html?index=../..index#9 Authors: Matthew Yedlin, Mohammad Jafari, Department of Computer and Electrical Engineering, University of British Columbia.
High Level Introduction to CNN
Views: 402 · 4 years ago
This video is a high-level introduction to CNNs. Attribution-NonCommercial-ShareAlike CC BY-NC-SA. Authors: Matthew Yedlin, Mohammad Jafari, Department of Computer and Electrical Engineering, University of British Columbia.
Introduction to Hyperparameters
Views: 259 · 4 years ago
Introduction to Weight Initialization
Views: 418 · 4 years ago
Introduction to the Learning Rate
Views: 330 · 4 years ago
Vanishing and Exploding Gradients
Views: 878 · 4 years ago
Momentum: Its use in Gradient Descent
Views: 438 · 4 years ago
Gradient Descents: Where are We and Next Steps
Views: 275 · 4 years ago
Machine Learning: Where are We?
Views: 399 · 4 years ago
Implementation of SGD
Views: 462 · 4 years ago
Review of Gradient Application in Deep Learning
Views: 308 · 4 years ago
Regularization in Nature - Nervous System
Views: 256 · 4 years ago
Regularization - Data Augmentation
Views: 503 · 4 years ago
Regularization - Dropout
Views: 4K · 4 years ago
Regularization L2, L1
Views: 3K · 4 years ago
Automatic Differentiation
Views: 13K · 4 years ago
Introduction to Regularization
Views: 626 · 4 years ago
Categorical Cross-Entropy Loss Softmax
Views: 16K · 4 years ago
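Since the last title pairs softmax with categorical cross-entropy, a minimal NumPy sketch of the two may be useful (values are illustrative, not from the video):

```python
import numpy as np

def softmax(z):
    z = z - np.max(z)          # shift by the max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def categorical_cross_entropy(p, q):
    """Cross-entropy between a true distribution p and a prediction q."""
    return -np.sum(p * np.log(q))

logits = np.array([2.0, 1.0, 0.1])
q = softmax(logits)                    # predicted class probabilities
p = np.array([1.0, 0.0, 0.0])          # one-hot true label
print(categorical_cross_entropy(p, q))
```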

COMMENTS

  • @sherifffruitfly · 11 days ago

    Ah so dual numbers are used to reduce calculus to linear algebra. Important because linear algebra is the only thing we've gotten computers to do well. Cool!
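That observation can be made concrete with a tiny dual-number class whose second slot carries the derivative through ordinary arithmetic; this is a sketch for illustration, not code from the video:

```python
import math

class Dual:
    """Dual number a + b*eps with eps**2 == 0; b carries the derivative."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

def sin(d):
    return Dual(math.sin(d.a), math.cos(d.a) * d.b)

# d/dx [x * sin(x)] at x = 2: seed the derivative slot with 1.
x = Dual(2.0, 1.0)
y = x * sin(x)
print(y.a, y.b)   # value and derivative: 2*sin(2) and sin(2) + 2*cos(2)
```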

  • @David-kw5uj · a month ago

    Fine, but which is the final result? My computations give -2sin(2) - 6cos(2), but I am not sure I applied the method properly.

  • @deltax7159 · 2 months ago

    We share two things in common! I am a classically trained violinist and finishing my graduate degree in applied statistics with a special emphasis on ML and building neural networks! Would love to chat with you sometime and play some music :)

  • @deltax7159 · 2 months ago

    Such a great video. This was explained exceedingly well.

  • @mattyedlin7292 · 3 months ago

    Thank you!

  • @user-ee3tv8wm5t · 3 months ago

  • @user-oy7ut2dc2o · 3 months ago

    Are we taking the points randomly to work on?

    • @mattyedlin7292 · 3 months ago

      We sweep backwards from the loss function, using the chain rule, through the computational graph.
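A toy numerical version of that backward sweep, with made-up values and a one-parameter graph, just to illustrate the idea:

```python
# Forward pass through a tiny computational graph: L = (w*x + b - y)**2
w, x, b, y = 3.0, 2.0, 1.0, 5.0
u = w * x        # node 1
v = u + b        # node 2
r = v - y        # residual
L = r ** 2       # loss

# Backward sweep: start at the loss and apply the chain rule node by node.
dL_dr = 2 * r
dL_dv = dL_dr * 1.0    # dr/dv = 1
dL_du = dL_dv * 1.0    # dv/du = 1
dL_dw = dL_du * x      # du/dw = x
dL_db = dL_dv * 1.0    # dv/db = 1
print(dL_dw, dL_db)    # gradients of the loss with respect to w and b
```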

  • @openknect3386 · 4 months ago

    ❤ this intuitive example video. Thanks so much.

  • @trajanobertrandlleraromero6579 · 4 months ago

    I came looking for copper and found gold!!!!

  • @charuudhiman2372 · 5 months ago

    Very nice explanation ✅👍

  • @striderQED · 7 months ago

    I watched one of your math videos and I thought: this guy has to play music!

  • @striderQED · 7 months ago

    Dude, this is awesome. Thanks

  • @raymondchang9481 · 10 months ago

    How much is an intercontinental ballistic missile?

  • @MAX_TV99 · 10 months ago

    Thank you ❤

    • @mattyedlin7292 · 10 months ago

      I'm glad you liked this video. Please remind me in class next Wednesday to tell a story about this video. Thank you! (On Friday I will be recovering from eye surgery and Sam will take the class.)

    • @MAX_TV99 · 10 months ago

      @mattyedlin7292 ok sir ✌️

    • @MAX_TV99 · 10 months ago

      @mattyedlin7292 get better soon sir 🙏🏽

    • @mattyedlin7292 · 10 months ago

      @MAX_TV99 Thank you. I am steadily recovering 😊

  • @WombRaidrr · 11 months ago

    Not "We'll Meet Again", but it's nice

  • @a_thxna · a year ago

    This is absolutely beautiful

  • @Degenerac1ng · a year ago

    cool

  • @izzzzzz6 · a year ago

    Welcome to Hogwarts Mr. Potter! lol

  • @veganath · a year ago

    Wow! Hats off to you guys, perfect in demystifying Categorical Cross-Entropy.... thank you!

  • @lucyfrye6723 · a year ago

    It's a good video but good grief, encourage people to take a week-long course in linear algebra. If you keep feeding them summation symbols and indices they will never do it. It's HARDER, not easier, to spell it all out. Professor Strang's course is probably still on YouTube if you are interested. You will gain back that week by being twice as productive in the week after. Not to mention the rest of your life.

    • @mattyedlin7292 · a year ago

      Hello Lucy, thank you for your input! I am always interested in comments that improve the videos. Would you suggest any additional material to address the summation issue? I learned it in high school as a prelim to proof by induction, a long time ago.

  • @adamradekmartinez1536 · a year ago

    I come for the acting, not the academic content.

  • @runemacquoy7005 · a year ago

    Very interesting

  • @ahmadbinali4668 · a year ago

    Love your method

  • @AnubhavApurva · a year ago

    Thank you!

  • @jimbobur · a year ago

    *(EDIT: Solved it, see comment reply)* I don't follow how you go from the numerical example, where the likelihood is a product of predicted and observed probabilities p_i and q_i each raised to the number of times they occur, to the algebraic expression of the likelihood where you take the product of q_i raised to N * p_i (or is that N_p_i? I'm a little unsure whether the p_i is a subscript of the N or multiplied by it).

    • @jimbobur · a year ago

      I worked it out. The answer is to remember that the number of times outcome i (with probability p_i) occurs can be expressed by rearranging the definition p_i = N_p_i / N, and substituting this into the expression for the likelihood in the general form that follows from the numerical example: L = Π q_i ^ N_p_i, giving L = Π q_i ^ (N·p_i).
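Written out with the same symbols (N_{p_i} denoting the count of outcome i in N trials), the substitution in that reply is:

```latex
p_i = \frac{N_{p_i}}{N}
\;\Longrightarrow\;
N_{p_i} = N\,p_i ,
\qquad
L = \prod_i q_i^{\,N_{p_i}} = \prod_i q_i^{\,N p_i},
\qquad
\frac{1}{N}\log L = \sum_i p_i \log q_i .
```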

  • @awesomerpyt6594 · a year ago

    Hey, I was wondering where the speed number came from? 1/7*10^8 ?

  • @tosintibeju8797 · a year ago

    Thank you very much for the simplicity.

  • @kartikgamer2.071 · a year ago

    Amazing

  • @slim590 · a year ago

    Thank you very much

  • @slim590 · a year ago

    This is exactly what Elon Musk was talking about. Teachers need to make it fun and exciting like this video for students to learn.

  • @versus5341 · a year ago

    Rest in peace, legend 😊

    • @gamin4543 · a year ago

      What happened?

    • @versus5341 · a year ago

      @gamin4543 Vera Lynn, the writer of the song, died from Covid in 2020; she was 103 years old.

    • @gamin4543 · a year ago

      @versus5341 Oh, I thought something happened to the guy playing. Thank you.

    • @versus5341 · a year ago

      Yeah, no problem 😉

  • @vtrandal · a year ago

    I love it!

  • @seacaptain72 · a year ago

    This might have been the funniest video I've seen in some time hahaha. 1:25 was outstanding hahaha

  • @AJ-et3vf · a year ago

    Great video! Thank you!

  • @bff9553 · a year ago

    👍

  • @coreyrosteutcher8429 · 2 years ago

    No sound, please fix. Ty

  • @Tibug · 2 years ago

    I always get chills when people hold their whiteboard pen orthogonally.

  • @igomaur4175 · 2 years ago

    wowww

  • @RealMcDudu · 2 years ago

    The challenge at 07:00 - the trick is to use the full (here unproven) rule f(x + yε) = f(x) + y f'(x) ε. That way you can also prove the chain rule.
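Spelling that rule out (with \epsilon^2 = 0) also gives the chain rule in one line:

```latex
f(x + y\epsilon) = f(x) + y\,f'(x)\,\epsilon,
\qquad
g\bigl(f(x + \epsilon)\bigr)
  = g\bigl(f(x) + f'(x)\,\epsilon\bigr)
  = g\bigl(f(x)\bigr) + g'\bigl(f(x)\bigr)\,f'(x)\,\epsilon .
```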

  • @davidlearnforus · 2 years ago

    Hi, thank you so much! I am a self-learner, without much formal background. Can you please explain how SUM p_i log q_i is entropy, since it does not have a minus sign? If it were log(1/q_i) we would get a minus sign out of it, but it's not. I'm stuck there...
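For reference, the standard definition carries the minus sign (equivalently, the logarithm of the reciprocal), so a sum written without it is the negative of the cross-entropy:

```latex
H(p, q) \;=\; -\sum_i p_i \log q_i \;=\; \sum_i p_i \log \frac{1}{q_i} .
```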

  • @keshavmaheshwari521 · 2 years ago

    Thank you

  • @jeremyjuwono3120 · 2 years ago

    This was such a fun video! Even though I came here just for a refresher, the video did not feel too slow. Yet I get the feeling it would be good for beginners too. In my opinion you seem to have achieved the balance of teaching to people of all levels. Amazing job!

  • @poojakabra1479 · 2 years ago

    Can someone explain the last slide? I can't get my head around the diamond and circle graphs.

  • @DavidCymballa · 2 years ago

    Thank you

  • @gabrielmccartney7975 · 2 years ago

    Can we use dual numbers for integration?

    • @mattyedlin7292 · 2 years ago

      This is a very interesting question. Dual numbers provide a way of automating differentiation for the sequence of backpropagation local maps. I will think about this more in the coming week.

    • @gabrielmccartney7975 · 2 years ago

      @mattyedlin7292 Thank you very much, I'm looking forward to that!

    • @gabrielmccartney7975 · 2 years ago

      @mattyedlin7292 What topics about dual numbers and automatic differentiation have not been studied yet?

  • @ahmednagi7074 · 2 years ago

    You are a really great teacher; I can't express my thoughts in words.

  • @moaks5125 · 2 years ago

    Hey Matthew. Great to sit through one of your proofs again after so many years. You're looking great, by the way. It took me back to our first year at UBC together. You, me and Mike Star. All the best, Tom

    • @mattyedlin7292 · 2 years ago

      Hi Tom, thanks very much. I fondly remember our times together in that first year. It was and still is a pleasure to recall how I mentored a great student at that time. I still recall debugging the red-black 3D Poisson solver with you. I did see your comment on LinkedIn about your recollection of us. I did want to reply, but things continuously went sideways. Anyway, I am glad to re-connect. If we come out east, we will certainly visit you and Heather. We will give you lots of notice. As you can tell, I am still working and having the best time at work since I started. I am a Faculty-in-Residence, doing video work for teaching. You certainly got one video hot off the press -- I just finished it this morning and submitted it for YouTube publishing only about an hour and a half ago. Let's see if we can have a Zoom at the end of March! Cheers, Matt

    • @moaks5125 · 2 years ago

      @mattyedlin7292 Morning Matt, it is great to reconnect in a time when everything seems to distract us. I couldn't help but notice that your energy and enthusiasm haven't waned an iota. Faculty-in-Residence sounds like a position with a lot of interesting work but also really cool sidebars. I came across my code the other day and have been toying with the idea of getting it up and running on a 64-bit architecture machine. We'll see. That distraction thing I mentioned earlier keeps putting up other priorities. It would be great to see you, both virtually and in person. The offer for a visit is an open invitation, so just let us know. As for the Zoom call at the end of March, that sounds great. Let's keep connected and make it happen. I will drop you an email. Take care, my friend and teacher. Cheers, TwR

  • @NaysWRLD · 2 years ago

    Thanks a lot for this

  • @CarterColeisInfamous · 2 years ago

    4:33 I love the Socratic method

  • @garrettosborne4364 · 2 years ago

    Can the old guy.