GELU activation function in 💯 lines of PyTorch code | Machine Learning

  • Published Jun 23, 2023
  • Machine Learning: Implementation of the paper "Gaussian Error Linear Units (GELUs)" in 100 lines of PyTorch code (a minimal sketch follows the description below).
    Link to the paper: arxiv.org/abs/1606.08415
    GitHub: github.com/MaximeVandegar/Pap...
    -----------------------------------------------------------------------------------------------------
    CONTACT: papers.100.lines@gmail.com
    #python #gelu #neuralnetworks #machinelearning #artificialintelligence #deeplearning #data #bigdata #supervisedlearning #research #activationfunction #relu #activation #function
  • Science & Technology
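
For reference, here is a minimal sketch of the GELU activation from the paper, not the video's exact 100-line implementation; the module name and the `approximate` flag are illustrative choices. GELU is defined as x · Φ(x), where Φ is the standard normal CDF, and the paper also gives a tanh-based approximation.

```python
# Minimal GELU sketch (arXiv:1606.08415). Illustrative only;
# the class name and the `approximate` flag are assumptions,
# not the video's exact code.
import math
import torch
import torch.nn as nn


class GELU(nn.Module):
    """Gaussian Error Linear Unit: GELU(x) = x * Phi(x),
    where Phi is the standard normal CDF."""

    def __init__(self, approximate: bool = False):
        super().__init__()
        self.approximate = approximate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.approximate:
            # Tanh approximation from the paper:
            # 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
            return 0.5 * x * (1.0 + torch.tanh(
                math.sqrt(2.0 / math.pi) * (x + 0.044715 * x.pow(3))))
        # Exact form via the error function:
        # Phi(x) = 0.5 * (1 + erf(x / sqrt(2)))
        return x * 0.5 * (1.0 + torch.erf(x / math.sqrt(2.0)))


if __name__ == "__main__":
    x = torch.linspace(-3, 3, 7)
    # Both forms should closely match PyTorch's built-in nn.GELU()
    print(GELU()(x))
    print(nn.GELU()(x))
```

The tanh approximation trades a little accuracy for speed and is the variant popularized by models such as BERT and GPT-2.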

COMMENTS • 2

  • @user-td8vz8cn1h 8 months ago +2

    Your videos are precious, bruh. You're the only person I've seen lately who actually implements and explains new paper concepts and drives other people to learn as ML engineers. Keep it up.

    • @papersin100linesofcode 8 months ago

      I am glad to hear that you like them! Thank you so much for your comment!