Introduction to Directed Graphical Models | Implementation in TensorFlow Probability
- Published 20 May 2024
- In this video we introduce directed graphical models (DGMs) with the help of a simple example. DGMs use DAGs (directed acyclic graphs) to model the factorization of a joint pdf/pmf. Here are the notes: raw.githubusercontent.com/Cey...
TensorFlow Probability has built-in support for these models and allows for a Pythonic way of describing them.
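The core idea, before any library gets involved, is that the DAG tells you how to sample: draw each node given its parents. Here is a minimal sketch of that ancestral sampling for a two-node weather → happiness model in plain Python; the specific probabilities are made up for illustration and are not the ones from the video.

```python
import random

# Ancestral sampling from the DAG W -> H, where the joint
# factorizes as p(W, H) = p(W) * p(H | W).
# The probabilities below are illustrative assumptions.

def sample_joint(rng: random.Random):
    # p(W): weather is "sunny" with probability 0.7, "rainy" otherwise
    weather = "sunny" if rng.random() < 0.7 else "rainy"
    # p(H | W): the probability of being happy depends on the sampled weather
    p_happy = 0.9 if weather == "sunny" else 0.4
    happy = rng.random() < p_happy
    return weather, happy

rng = random.Random(0)
samples = [sample_joint(rng) for _ in range(10_000)]
frac_sunny = sum(w == "sunny" for w, _ in samples) / len(samples)
print(f"fraction sunny: {frac_sunny:.3f}")  # should be close to 0.7
```

TFP's `JointDistribution` classes express the same parent-then-child structure declaratively, which is what the video walks through.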
-------
📝 : Check out the GitHub Repository of the channel, where I upload all the handwritten notes and source-code files (contributions are very welcome): github.com/Ceyron/machine-lea...
📢 : Follow me on LinkedIn or Twitter for updates on the channel and other cool Machine Learning & Simulation stuff: / felix-koehler and / felix_m_koehler
💸 : If you want to support my work on the channel, you can become a Patreon here: / mlsim
-------
⚙️ My Gear:
(Below are affiliate links to Amazon. If you decide to purchase the product or something else on Amazon through this link, I earn a small commission.)
- 🎙️ Microphone: Blue Yeti: amzn.to/3NU7OAs
- ⌨️ Logitech TKL Mechanical Keyboard: amzn.to/3JhEtwp
- 🎨 Gaomon Drawing Tablet (similar to a WACOM Tablet, but cheaper, works flawlessly under Linux): amzn.to/37katmf
- 🔌 Laptop Charger: amzn.to/3ja0imP
- 💻 My Laptop (generally I like the Dell XPS series): amzn.to/38xrABL
- 📱 My Phone: Fairphone 4 (I love the sustainability and repairability aspect of it): amzn.to/3Jr4ZmV
If I had to purchase these items again, I would probably change the following:
- 🎙️ Rode NT: amzn.to/3NUIGtw
- 💻 Framework Laptop (I do not get a commission here, but I love the vision of Framework. It will definitely be my next Ultrabook): frame.work
As an Amazon Associate I earn from qualifying purchases.
-------
Timestamps:
00:00 Opening
00:14 Introduction to DGMs
01:30 Example
05:28 Simpler Example
08:07 TensorFlow Probability
What tools/gadgets/software are you using to make these videos?
I use a graphics tablet similar to the ones from WACOM. The brand is called Gaomon.
On the software side I use:
- Xournal++ for writing the notes
- OBS for recording screen and audio
- FlowBlade for cutting the videos
- VS Code for the coding-heavy videos like this one: ua-cam.com/video/ISZwydaKZNY/v-deo.html
- plus smaller tools like GIMP for photo editing, Notion for collecting video ideas etc.
I work entirely on (Arch) Linux. All the files (handwritten notes and source code) are uploaded to the GitHub Repo of the channel: github.com/Ceyron/machine-learning-and-simulation
I just found your channel, and I have to say, your videos are heavily underrated! You helped me a lot, thank you!
Thanks a lot for the nice words :) That really means a lot to me.
I love teaching, and it is fantastic to hear that the videos are of great value.
@@MachineLearningSimulation hey man, here a year later and fully agree!!
These videos have been immensely helpful for me! Keep up the great work!
Thanks so much for the kind words. Much appreciated 👍 😊
I have watched a lot of ML videos on the internet, especially about probabilistic AI, and I have to say that even though you have a small audience, you are probably the best YouTuber out there making videos about that. Keep on with your work, it is a blessing for more and more graduate students haha!
Thanks a lot for the comment, also under the other video :). It's much appreciated that you enjoy my way of teaching.
The TFP part is very useful and serves as a nice intro.
Thanks 🙏
I'm glad it was helpful.
Hi, thank you for your great videos!
So nice of you :)
Thanks a lot.
This channel is really excellent. Subscribed now so that I don't miss any updates. But I don't know why YouTube is not recognizing it...
Thanks a lot, Mukunthan. :) I really appreciate your support.
The channel is still small, I hope to reach a wider audience in the future. Feel free to share it with peers & friends. :)
@@MachineLearningSimulation Will do my best to recommend it to my friend group. I'm expecting this kind of channel to get recognised... This channel motivates beginners and experienced data science people to play with probability and statistics.
@@mukunthanr2514 Thanks a lot, :) I really appreciate that.
Hey, you mentioned three ways of declaring DAGs/joint distributions/Bayesian networks. Could you maybe say what the other two are so I can look them up?
Hi, thanks for the comment! :) Can you give a timestamp for when I mention that in the video? It's been some time since I uploaded it.
Honestly, this is an excellent video to explain factorization. Is it possible to make a video using PyTorch?
Thanks a lot :). Much appreciated.
I must say, I do not have much experience with PyTorch. You probably refer to the Pyro library. Unfortunately, I could not teach that with the necessary confidence. Though, I believe that many ideas should easily translate from TFP to any other probabilistic framework.
At 8:00, you state that the random variable H is distributed by the piecewise function you wrote. However, isn't this notation slightly imprecise? Writing "H ~" does not make explicit that the distribution is conditioned on the value of W. So rather it would be "H|W ~" or "p(H|W) = "
Hi,
you are correct. That could have been more precise. :)
I just hoped that this might simplify it for a beginner to understand ;).
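For reference, the fully explicit notation the commenter is asking about would read as follows; the Bernoulli form with a weather-dependent parameter θ(w) is an assumption standing in for the piecewise function from the video:

```latex
% Joint factorization for the DAG W -> H
p(W, H) = p(W)\, p(H \mid W)
% The conditional of H given a realization W = w, written explicitly:
H \mid W = w \;\sim\; \mathrm{Bernoulli}\!\big(\theta(w)\big)
```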
Is there a way to learn the mapping weather ---> happiness from data?
Great question! Yes, there are multiple:
For instance, check out this video: ua-cam.com/video/l2f6Ic6SeqE/v-deo.html on how to use the Automatic Differentiation features of TensorFlow Probability together with gradient descent optimization in order to perform a Maximum Likelihood Estimate.
Depending on the distributions you choose, chances are you can also find analytical expressions for the parameters of the involved distributions.
Let me know if that helped :)
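To make the "analytical expressions" remark concrete: for a discrete weather variable, the Maximum Likelihood Estimate of a categorical distribution has a closed form, namely the empirical category frequencies. A minimal sketch with made-up observations:

```python
from collections import Counter

# Hypothetical observed weather data; in practice this comes from a dataset
observations = ["sunny", "sunny", "rainy", "sunny", "rainy",
                "sunny", "sunny", "rainy", "sunny", "sunny"]

counts = Counter(observations)
n = len(observations)
# Closed-form MLE for a categorical distribution: empirical frequencies
theta_mle = {category: count / n for category, count in counts.items()}
print(theta_mle)  # {'sunny': 0.7, 'rainy': 0.3}
```

For distributions without such a closed form, the gradient-descent approach from the linked video applies.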
Awesome video! What if the weather were to have 3 categories (bad, average, good) and the happiness were to be a continuous number whose mean and standard deviation were dependent on the choice of weather?
That's a great question. 😊
Indeed, these DGMs are extremely flexible. You should be able to model this in TensorFlow Probability as well.
A common application in which you have a discrete distribution over multiple categories (also called a "categorical distribution") that influences the mean and standard deviation of a normal distribution is a Gaussian Mixture Model. I cover them later in the series on probabilistic machine learning.
Do you have a specific question regarding your proposed DGM for the weather?
@@MachineLearningSimulation Really really grateful for your response! I'll have to check out those videos! And yes, I do. Given one of the three weather categories, there is a mean and std that describe the amount of money spent online.
If weather is bad, a website gets an average of 700 dollars of business with std of 50 dollars.
If the weather is average, the website gets an average of 500 dollars with std of 40
If the weather is good, the website gets only 200 dollars with a std of 30.
Finally, the probability of the weather being either good, average, or bad sums to 1.
0.5 good, 0.4 average, 0.1 bad.
It would be awesome to see this worked out.
That sounds a lot like the mentioned Gaussian Mixture Model.
Then maybe check out this video from the channel: ua-cam.com/video/atDp5bkzej4/v-deo.html
Let me know if that helped 😊
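The setup described in the thread above (categorical weather parent, Gaussian spend child) can be sampled with a few lines of plain Python. This is a sketch of the ancestral-sampling view of that mixture, not the TFP implementation from the linked video:

```python
import random

# Parameters taken from the thread: p(weather) and the conditional
# Normal(mean, std) for money spent given the weather.
WEATHER_PROBS = {"good": 0.5, "average": 0.4, "bad": 0.1}
SPEND_PARAMS = {"good": (200.0, 30.0), "average": (500.0, 40.0), "bad": (700.0, 50.0)}

def sample_spend(rng: random.Random):
    # Ancestral sampling: first the categorical parent, then the Gaussian child
    categories = list(WEATHER_PROBS)
    weights = [WEATHER_PROBS[c] for c in categories]
    weather = rng.choices(categories, weights=weights)[0]
    mean, std = SPEND_PARAMS[weather]
    return weather, rng.gauss(mean, std)

rng = random.Random(42)
samples = [sample_spend(rng) for _ in range(50_000)]
mean_spend = sum(s for _, s in samples) / len(samples)
# Close to the analytic mixture mean: 0.5*200 + 0.4*500 + 0.1*700 = 370
print(f"average spend: {mean_spend:.1f}")
```

Fitting such a model from data (rather than fixing the parameters) is exactly what the Gaussian Mixture Model video covers.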
@@MachineLearningSimulation thank you 🙏. Also, would these be considered formal machine learning approaches? This is a new field for me. Thanks! Really really great content. A gem.
You're welcome 😊 I'm happy I could help.
Yes, I would consider them "formal Machine Learning approaches". If you want more details and more mathematical derivations, then I would recommend either "Pattern Recognition and Machine Learning" by Christopher Bishop or "Machine Learning: A Probabilistic Perspective" by Kevin Murphy. But be aware that they are extremely tough mathematically. I try to keep the difficulty lower throughout the videos.
For context: Gaussian Mixture models are commonly used to cluster data points (that is also how they are implemented for instance in scikit-learn), but they are equally valid generative models, i.e., machine learning models that can produce new (previously unseen) data observations.