How to handle Uncertainty in Deep Learning #1.2
- Published 29 Jun 2024
- ▬▬ Code ▬▬▬▬▬▬▬▬▬▬▬
Colab Notebook: colab.research.google.com/dri...
▬▬ Used Music ▬▬▬▬▬▬▬▬▬▬▬
Music from Uppbeat (free for Creators!):
uppbeat.io/t/pryces/lateflights
License code: 3O8NFX8WUHJBR2SB
▬▬ Used Videos ▬▬▬▬▬▬▬▬▬▬▬
From these Pexels authors:
Tom Fisk
▬▬ Timestamps ▬▬▬▬▬▬▬▬▬▬▬
00:00 Introduction
00:15 Dataset
01:38 Model 1
07:37 Model 2
12:26 Model 3
▬▬ Support me if you like 🌟
►Link to this channel: bit.ly/3zEqL1W
►Support me on Patreon: bit.ly/2Wed242
►Buy me a coffee on Ko-Fi: bit.ly/3kJYEdl
►E-Mail: deepfindr@gmail.com
▬▬ My equipment 💻
- Microphone: amzn.to/3DVqB8H
- Microphone mount: amzn.to/3BWUcOJ
- Monitors: amzn.to/3G2Jjgr
- Monitor mount: amzn.to/3AWGIAY
- Height-adjustable table: amzn.to/3aUysXC
- Ergonomic chair: amzn.to/3phQg7r
- PC case: amzn.to/3jdlI2Y
- GPU: amzn.to/3AWyzwy
- Keyboard: amzn.to/2XskWHP
- Bluelight filter glasses: amzn.to/3pj0fK2 - Science & Technology
This is gold. Thanks for the thorough and great content!
Thanks!!
thanks, that was really helpful! Looking forward to part 3.
Thank you! Next part is coming next week :)
One thing I want to say: since test data is usually used only once, at test time, it would be better to call this "test data" validation data.
Brilliant!!! These videos help me a lot in understanding uncertainty. Could you make more videos regarding this topic? Thank you so much.
Thanks!
It's probably worth mentioning that you are computing gradients on your test set by not wrapping the test loop in torch.no_grad(). Since this series is all about uncertainty, it's important that gradients computed on the test set don't leak into your mu and var values, which would be contrary to what you are trying to show.
Hi! Good remark. But as long as you aren't running backpropagation with respect to the test loss, no information leaks into the model weights. torch.no_grad() is mainly used to save memory.
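To illustrate the point above, here is a minimal sketch (the model and tensor names are illustrative, not from the video): a forward pass under torch.no_grad() records no autograd graph, and weights only change when backward() and an optimizer step are actually called.

```python
import torch
import torch.nn as nn

# Illustrative stand-in for the mu/var network from the video.
model = nn.Linear(4, 2)

x_test = torch.randn(16, 4)

# Evaluating under no_grad skips building the autograd graph,
# which saves memory; nothing is backpropagated into the weights.
with torch.no_grad():
    preds = model(x_test)

assert not preds.requires_grad  # no gradient graph was recorded
```

Without the no_grad context the graph would be built (costing memory), but the weights would still be untouched unless loss.backward() and optimizer.step() were called.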
Thanks for the video... I learned a lot. Please upload some videos on uncertainty estimation in image classification.
Thanks! For image classification the same principles apply. You just use convolutional layers (Conv2d) instead of Linear ones.
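As a hedged sketch of that reply (architecture and names are illustrative, not from the video): the same two-head idea, one head predicting the mean and one the variance, works with a convolutional feature extractor for images.

```python
import torch
import torch.nn as nn

class ConvUncertaintyNet(nn.Module):
    """Illustrative conv net with separate mean and variance heads."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.mu_head = nn.Linear(16, 1)
        self.var_head = nn.Linear(16, 1)

    def forward(self, x):
        h = self.features(x)
        mu = self.mu_head(h)
        # softplus keeps the predicted variance strictly positive
        var = nn.functional.softplus(self.var_head(h)) + 1e-6
        return mu, var

net = ConvUncertaintyNet()
mu, var = net(torch.randn(2, 3, 32, 32))  # batch of two 32x32 RGB images
```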
What if p(y|x) is not Gaussian? What if y is high-dimensional?
There are alternative loss functions for other distributions, like Laplace etc., or you can transform the target variable in some way to match a supported distribution.
Multidimensionality is no problem for GaussianNLLLoss: it simply applies the calculation per dimension and averages.
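A short sketch of both points in the reply above (shapes and variable names are illustrative): GaussianNLLLoss accepts multi-dimensional targets, computing the negative log-likelihood per dimension and reducing it, and an analogous NLL for a non-Gaussian target can be built from torch.distributions.

```python
import torch
import torch.nn as nn
from torch.distributions import Laplace

loss_fn = nn.GaussianNLLLoss(reduction="mean")

mu = torch.randn(8, 3)          # predicted means, 3 output dimensions
var = torch.rand(8, 3) + 1e-6   # predicted variances, must be positive
y = torch.randn(8, 3)           # targets

# Scalar loss, averaged over batch and output dimensions.
gaussian_nll = loss_fn(mu, y, var)

# Alternative for a Laplace-distributed target: negative mean log-probability,
# with the scale b playing the role the variance plays above.
b = torch.rand(8, 3) + 1e-6
laplace_nll = -Laplace(mu, b).log_prob(y).mean()
```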
@@DeepFindr Thanks for the helpful reply. A follow-up question: what about using a conditional generative model to handle aleatoric uncertainty?
Sure, that's also a reasonable approach. You can learn the data distribution and detect out-of-distribution samples using generative models.