GELU activation function in 💯 lines of PyTorch code | Machine Learning
- Published 23 Jun 2023
- Machine Learning: Implementation of the paper "Gaussian Error Linear Units (GELUs)" in 100 lines of PyTorch code.
Link to the paper: arxiv.org/abs/1606.08415
GitHub: github.com/MaximeVandegar/Pap...
-----------------------------------------------------------------------------------------------------
CONTACT: papers.100.lines@gmail.com
#python #gelu #neuralnetworks #machinelearning #artificialintelligence #deeplearning #data #bigdata #supervisedlearning #research #activationfunction #relu #activation #function - Science & Technology
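For reference, the activation covered in the video can be sketched in a few lines of plain Python. This is not the video's PyTorch implementation, just a minimal illustration of the two forms defined in the paper: the exact GELU, x·Φ(x) with Φ the standard normal CDF, and the tanh approximation 0.5·x·(1 + tanh(√(2/π)·(x + 0.044715·x³))). Function names are my own.

```python
import math

def gelu_exact(x: float) -> float:
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF,
    # computed here via the error function: Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def gelu_tanh(x: float) -> float:
    # Tanh approximation from the paper (Hendrycks & Gimpel, 2016):
    # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + math.tanh(math.sqrt(2.0 / math.pi) * (x + 0.044715 * x ** 3)))
```

For large positive inputs GELU approaches the identity (like ReLU), while for negative inputs it decays smoothly to zero instead of cutting off hard; the two variants agree to within roughly 1e-3 over typical activation ranges.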
Your videos are precious, bruh. You're the only person I've seen lately who actually implements and explains new paper concepts and drives other people to learn as ML engineers. Keep it up.
I am glad to hear that you like them! Thank you so much for your comment!