Matt Yedlin
Joined Oct 5, 2011
Dancing with AI - A Visual Bridge from Dance to ChatGPT
How does ChatGPT work?
In this video, Dr. Matthew Yedlin (Associate Professor in Electrical & Computer Engineering, UBC) and Armin Saadat (MASc student, Applied Science, UBC) demonstrate how the pattern-matching algorithms of large language models (LLMs), such as ChatGPT, can be visualized through dance.
Views: 312
Videos
Time Series Analysis with RNNs: The Model Problem Revisited
608 views · 4 years ago
This video revisits the model problem within the context of RNNs. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: Matthew Yedlin, Delaram Behnami Department of Computer and Electrical Engineering, University of British Columbia.
Machine Learning: In-depth look at Long Short Term Memory
697 views · 4 years ago
This video takes an in-depth look at Long Short Term Memory. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: Matthew Yedlin, Delaram Behnami Department of Computer and Electrical Engineering, University of British Columbia.
Introduction to Gated Recurrent Units
2.8K views · 4 years ago
This video introduces Gated Recurrent Units. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: Matthew Yedlin, Delaram Behnami Department of Computer and Electrical Engineering, University of British Columbia.
Machine Learning: Introduction to Long-Short Term Memory (LSTM)
580 views · 4 years ago
This video talks about Long-Short Term Memory. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: Matthew Yedlin, Delaram Behnami Department of Computer and Electrical Engineering, University of British Columbia.
Limitations of the basic RNN
539 views · 4 years ago
This video talks about the limitations of the basic RNN. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: Matthew Yedlin, Delaram Behnami Department of Computer and Electrical Engineering, University of British Columbia.
RNNs: How do they work?
475 views · 4 years ago
This video talks about how RNNs work. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: Matthew Yedlin, Delaram Behnami Department of Computer and Electrical Engineering, University of British Columbia.
Introduction to RNNs
479 views · 4 years ago
This video introduces RNNs. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: Matthew Yedlin, Delaram Behnami Department of Computer and Electrical Engineering, University of British Columbia.
Time-Series Analysis with RNNs: A model problem
307 views · 4 years ago
This video talks about Time-Series Analysis with RNNs and introduces a model problem. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: Matthew Yedlin, Delaram Behnami Department of Computer and Electrical Engineering, University of British Columbia.
Basic Architecture of CNN
865 views · 4 years ago
This video talks about the Basic Architecture of CNN. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Source Used: codelabs.developers.google.com/codelabs/cloud-tensorflow-mnist/index.html?index=../..index#9 Authors: Matthew Yedlin, Mohammad Jafari Department of Computer and Electrical Engineering, University of British Columbia.
Detailed Architecture Replacement
259 views · 4 years ago
This video talks about Detailed Architecture Replacement. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: Matthew Yedlin, Mohammad Jafari Department of Computer and Electrical Engineering, University of British Columbia.
Receptive Field: Hubel and Wiesel
1.4K views · 4 years ago
This video talks about the Receptive Field by Hubel and Wiesel. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Source Used: codelabs.developers.google.com/codelabs/cloud-tensorflow-mnist/index.html?index=../..index#9 Authors: Matthew Yedlin, Mohammad Jafari Department of Computer and Electrical Engineering, University of British Columbia.
High Level Introduction to CNN
402 views · 4 years ago
This video is a high level introduction to CNN. Attribution-NonCommercial-ShareAlike CC BY-NC-SA Authors: Matthew Yedlin, Mohammad Jafari Department of Computer and Electrical Engineering, University of British Columbia.
Gradient Descents: Where are We and Next Steps.
275 views · 4 years ago
Review of Gradient Application in Deep Learning
308 views · 4 years ago
Regularization in Nature - Nervous System
256 views · 4 years ago
Categorical Cross-Entropy Loss Softmax
16K views · 4 years ago
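The cross-entropy video above pairs a softmax output layer with the categorical cross-entropy loss. As a rough sketch of that computation in plain Python (the example logits and variable names are mine, not taken from the video):

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(z - m) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def categorical_cross_entropy(p, q):
    # H(p, q) = -sum_i p_i * log(q_i), where p is the (one-hot or soft)
    # target distribution and q is the predicted distribution.
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q))

logits = [2.0, 1.0, 0.1]     # raw network scores (made-up example)
q = softmax(logits)          # predicted probabilities, sums to 1
p = [1.0, 0.0, 0.0]          # one-hot target: true class is 0
loss = categorical_cross_entropy(p, q)
```

With a one-hot target the loss reduces to minus the log of the probability the model assigns to the true class, which is why confident wrong answers are penalized so heavily.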
Ah so dual numbers are used to reduce calculus to linear algebra. Important because linear algebra is the only thing we've gotten computers to do well. Cool!
Fine, but what is the final result? My computation gives -2sin(2) - 6cos(2), but I am not sure I applied the method properly.
We share two things in common! I am a classically trained violinist, and I am finishing my graduate degree in applied statistics with a special emphasis on ML and building neural networks! Would love to chat with you sometime and play some music :)
such a great video. This was explained exceedingly well.
Thank you!
❤
are we taking the points randomly to work on?
We sweep backwards from the loss function- using the chain rule - sweeping backwards through the computational graph
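That backward sweep through the computational graph can be made concrete on a toy loss. This is only an illustrative sketch (the example function and the names are mine, not from the video):

```python
# Forward pass: L = (w*x - t)**2, recording each intermediate value.
w, x, t = 3.0, 2.0, 10.0
u = w * x          # u = 6.0
e = u - t          # e = -4.0 (the error)
L = e ** 2         # L = 16.0 (the loss)

# Backward sweep: start at dL/dL = 1 and apply the chain rule
# node by node, moving backwards through the graph.
dL_dL = 1.0
dL_de = dL_dL * 2 * e      # d(e**2)/de = 2e
dL_du = dL_de * 1.0        # d(u - t)/du = 1
dL_dw = dL_du * x          # d(w*x)/dw = x
dL_dx = dL_du * w          # d(w*x)/dx = w
```

Each local derivative is cheap; multiplying them along the reverse path is exactly the chain-rule sweep described above.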
❤ this intuitive example video. Thanks so much.
I came looking for copper and found gold!!!!
Very nice explanation ✅👍
I watched one of your math videos and I thought: this guy has to play music!
Dude, this is awesome. Thanks
How much is an intercontinental ballistic missile?
Thank youu❤
I’m glad you liked this video. Please remind me in class next Wednesday to tell a story about this video. Thank you! (Friday I will be recovering from eye surgery and Sam will take the class.)
@@mattyedlin7292 ok sir ✌️
@@mattyedlin7292 get better soon sir 🙏🏽
@@MAX_TV99 Thank you. I am steadily recovering 😊
Not "We'll Meet Again", but it's nice.
This is absolutely beautiful
cool
Welcome to Hogwarts Mr. Potter! lol
Wow! Hats off to you guys, perfect in demystifying Categorical Cross-Entropy... thank you!
It's a good video but good grief, encourage people to take a week-long course in linear algebra. If you keep feeding them summation symbols and indices they will never do it. It's HARDER, not easier to spell it all out. Professor Strang's course is probably still on youtube if you are interested. You will gain back that week by being twice as productive in the week after. Not to mention the rest of your life.
Hello Lucy, thank you for your input! I am always interested in comments to improve the videos. Would you suggest any additional material to address the summation issue? I learned it in high school as a preliminary to proof by induction, a long time ago.
I come for the acting, not the academic content.
Very interesting
Love your method
Thank you!
*(EDIT: Solved it, see comment reply)* I don't follow how you go from the numerical example, where the likelihood is a product of predicted and observed probabilities p_i and q_i each raised to the number of times they occur, to the algebraic expression of the likelihood where you take the product of q_i raised to N * p_i (or is that N_p_i? I'm a little unsure if the p_i is a subscript of the N or multiplied by it).
I worked it out. The answer is to remember that the number of times outcome i (with probability p_i) occurs can be expressed by rearranging the definition p_i = N_p_i / N and substituting this into the general form of the likelihood that follows from the numerical example: L = Π q_i ^ N_p_i, giving L = Π q_i ^ (N * p_i).
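The identity worked out in this thread, log L = Σ_i N_i log q_i = N Σ_i p_i log q_i, is easy to check numerically. A small sketch with made-up numbers (the distributions here are my own example):

```python
import math

# With counts N_i = N * p_i, the log-likelihood computed from counts
# must equal N times the cross-entropy-style sum over probabilities.
N = 100
p = [0.2, 0.3, 0.5]          # observed frequencies of each outcome
q = [0.25, 0.25, 0.5]        # model's predicted probabilities
counts = [N * pi for pi in p]

logL_counts = sum(Ni * math.log(qi) for Ni, qi in zip(counts, q))
logL_probs = N * sum(pi * math.log(qi) for pi, qi in zip(p, q))
```

Dividing by -N turns this log-likelihood into exactly the categorical cross-entropy -Σ p_i log q_i, which is the link the video builds.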
Hey, was wondering where did the speed number come from? 1/7*10^8 ?
Thank you very much for the simplicity.
Amazing
Thank you very much
This is exactly what Elon Musk was talking about. Teachers need to make it fun and exciting, like this video, for students to learn.
Rest In Peace legend 😊
what happened
@@gamin4543 Vera Lynn, the singer of the song, died in 2020; she was 103 years old.
@@versus5341 Oh, I thought something happened to the guy playing. Thank you.
Yeah no problem 😉
I love it!
This might have been the funniest video I've seen in some time hahaha. 1:25 was outstanding hahaha
Great video! Thank you!
👍
No sound please fix. Ty
I always get chills when people hold their whiteboard pen orthogonally.
wowww
The challenge at 07:00 - the trick is to use the full (here unproven) rule f(x+y\epsilon) = f(x)+y\epsilon df/dx. That way you can also prove the chain rule.
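The rule quoted here, f(x + yε) = f(x) + yε f'(x), is what a dual-number implementation of forward-mode automatic differentiation encodes. A minimal sketch (my own illustration, supporting only addition and multiplication):

```python
class Dual:
    """Dual number a + b*eps with eps**2 == 0; the eps coefficient
    carries the derivative (forward-mode automatic differentiation)."""
    def __init__(self, a, b=0.0):
        self.a, self.b = a, b
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.a + other.a, self.b + other.b)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a1 + b1*eps)(a2 + b2*eps) = a1*a2 + (a1*b2 + b1*a2)*eps,
        # because the eps**2 term vanishes: the product rule for free.
        return Dual(self.a * other.a, self.a * other.b + self.b * other.a)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x   # f'(x) = 6x + 2

d = f(Dual(4.0, 1.0))   # seed the eps part with 1 at x = 4
# d.a holds f(4) and d.b holds f'(4), with no symbolic calculus.
```

Evaluating f once on a dual number yields both the value and the exact derivative, which is the sense in which dual numbers reduce differentiation to (linear) algebra.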
Hi, thank you so much! I am a self-learner without much formal background. Can you please explain how SUM p_i log q_i is entropy, since it does not have a minus sign? If it were log(1/q_i) we would get the minus sign out of it, but it's not. I'm stuck there...
Thank you
This was such a fun video! Even though I came here just for a refresher, the video did not feel too slow. Yet I get the feeling it would be good for beginners too. In my opinion you seem to have achieved the balance of teaching to people of all levels. Amazing job!
Can someone explain the last slide? Can’t get my head around the diamond and circle graphs
Thank you
Can we use dual numbers for integration?
This is a very interesting question. Dual numbers provide a way of automating differentiation for the sequence of back propagation - local maps. I will think about this more in the coming week
@@mattyedlin7292 thank you very much I'm looking forward to that!
@@mattyedlin7292 What topics about dual numbers and automatic differentiation has not been studied yet?
You are a really great teacher; I can't express my thoughts in words.
Hey Matthew. Great to sit through one of your proofs again after so many years. You're looking great, by the way. It took me back to our first year at UBC together: you, me, and Mike Star. All the best, Tom
Hi Tom, thanks very much. I fondly remember our times together in that first year. It was, and still is, a pleasure to recall how I mentored a great student at that time. I still recall debugging the red-black 3D Poisson solver with you. I did see your comment on LinkedIn about your recollection of us; I wanted to reply, but things continuously went sideways. Anyway, I am glad to re-connect. If we come out east, we will certainly visit you and Heather, and we will give you lots of notice. As you can tell, I am still working and having the best time at work since I started. I am a Faculty-in-Residence, doing video work for teaching. You certainly got one video hot off the press; I just finished it this morning and submitted it for YouTube publishing only about an hour and a half ago. Let's see if we can have a Zoom at the end of March! Cheers, Matt
@@mattyedlin7292 Morning Matt, it is great to reconnect at a time when everything seems to distract us. I couldn't help but notice that your energy and enthusiasm haven't waned an iota. Faculty-in-Residence sounds like a position with a lot of interesting work but also really cool sidebars. I came across my code the other day and have been toying with the idea of getting it up and running on a 64-bit machine. We'll see; that distraction thing I mentioned earlier continues to put up other priorities. It would be great to see you, both virtually and in person. The offer of a visit is an open invitation, so just let us know. As for the Zoom call at the end of March, that sounds great. Let's keep connected and make it happen. I will drop you an email. Take care, my friend and teacher. Cheers, TwR
Thanks a lot for this
4:33 i love the Socratic method
Can the old guy.