Few-Shot Learning & Meta-Learning in 💯 lines of PyTorch code | MAML algorithm

  • Published 26 May 2023
  • Machine Learning: Implementation of the paper "Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks" in 100 lines of PyTorch code.
    Link to the paper: arxiv.org/abs/1703.03400
    GitHub: github.com/MaximeVandegar/Pap...
    -----------------------------------------------------------------------------------------------------
    CONTACT: papers.100.lines@gmail.com
    #python #pytorch #maml #neuralnetworks #machinelearning #artificialintelligence #deeplearning #data #bigdata #supervisedlearning #research #metalearning #reptile #fewshotlearning #learning #fewshot
  • Science & Technology

COMMENTS • 7

  • @aritramukhopadhyay7163 • 20 days ago

    Looks like you really like to reinvent the wheel... was that intentional, or can this really not be implemented in a straightforward way? Also, in your GitHub I saw that no one has committed a better version... may I try to write a better one and raise a PR?

    • @papersin100linesofcode • 19 days ago

      Hi, I am always open to improvements. If you think you can improve the code, please do not hesitate to make a PR.

  • @TheAlx2142 • 9 months ago

    Is this first-order MAML or second-order MAML?

    • @papersin100linesofcode • 8 months ago

      Excuse me for the delayed answer. This is second-order MAML because we compute the second-order gradients.

    • @thomaswohrle1623 • 4 months ago +1

      @@papersin100linesofcode Hey, thanks for the video. Where exactly are the second-order gradients calculated? Wouldn't this necessitate create_graph=True in the inner_loop?

    • @papersin100linesofcode • 4 months ago

      @@thomaswohrle1623 Thank you for your question. It is computed on line 75, where the loss depends on theta_prime. As for create_graph=True, that is a great question that I would need to investigate.
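      The distinction the thread is discussing can be sketched in a few lines. This is not the repository's code; the names (`alpha`, `theta_prime`, the toy regression data) are illustrative. The key point is that passing create_graph=True to the inner-loop gradient keeps that step differentiable, so the outer gradient flows through the adapted parameters and picks up the second-order terms; with create_graph=False the update is treated as a constant, which is first-order MAML (FOMAML).

      ```python
      import torch
      import torch.nn.functional as F

      # Hypothetical minimal second-order MAML step (toy 1-D regression,
      # one inner gradient step). Names are illustrative, not the repo's.
      torch.manual_seed(0)
      model = torch.nn.Linear(1, 1)
      alpha = 0.01  # inner-loop learning rate

      x_support, y_support = torch.randn(5, 1), torch.randn(5, 1)
      x_query, y_query = torch.randn(5, 1), torch.randn(5, 1)

      # Inner loop: one adaptation step on the support set.
      # create_graph=True records the graph of this gradient step itself,
      # so the outer loss can be differentiated through it (second order).
      inner_loss = F.mse_loss(model(x_support), y_support)
      grads = torch.autograd.grad(inner_loss, list(model.parameters()),
                                  create_graph=True)
      theta_prime = [p - alpha * g for p, g in zip(model.parameters(), grads)]

      # Outer loss evaluated at the adapted parameters on the query set.
      w, b = theta_prime
      query_loss = F.mse_loss(x_query @ w.t() + b, y_query)

      # Gradient w.r.t. the ORIGINAL parameters theta: because theta_prime
      # is a differentiable function of theta, this includes the
      # second-order (Hessian-vector) contribution.
      outer_grads = torch.autograd.grad(query_loss, list(model.parameters()))
      ```

      Dropping create_graph=True (or calling .detach() on the inner gradients) would make `outer_grads` the first-order approximation instead.
      
      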

    • @laxmigenius • 1 month ago

      Excellent explanation, thanks a ton!