You Should Be Using Automatic Differentiation

  • Published Aug 22, 2024

COMMENTS • 7

  • @SIQJoshimar12 · 8 years ago · +2

    Nice video. I'm doing some research into the topic myself and must say it was summarized accurately.

  • @codediporpal · 3 years ago

    Somebody finally explains that it's typically implemented by overloading basic math operations like + and sin(), so that derivative values are propagated alongside the function values. I spent hours reading articles that never bothered with this essential bit of information. (See the first sketch after the comments for a minimal example of this idea.)

  • @brandomiranda6703 · 7 years ago · +2

    So does Automatic Differentiation (AD) produce exact gradients or approximations? I think he said at some point that it's exact, but the method doesn't seem exact?

    • @sophiagold5892 · 7 years ago

      Exact. Take a look at the code if you're interested in more detail.

    • @sk.razibulislam4493 · 6 years ago · +4

      Exact up to machine precision. (The second sketch after the comments demonstrates this.)

    • @hashhoomy · 5 years ago · +1

      Check this for better background: ua-cam.com/video/vAp6nUMrKYg/v-deo.html

  • @brandomiranda6703 · 4 years ago

    Is there a link to learn more about Bayesian optimization for hyperparameter tuning?
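
On @codediporpal's point about operator overloading: below is a minimal forward-mode AD sketch in Python using dual numbers. The Dual class, the sin wrapper, and the example function are illustrative assumptions for this page, not the video's actual code. The key design point is that every overloaded operation applies the corresponding differentiation rule, so the derivative travels through the computation alongside the value.

import math

class Dual:
    """A value paired with its derivative; overloaded operators propagate both."""
    def __init__(self, val, der=0.0):
        self.val = val  # function value
        self.der = der  # derivative w.r.t. the chosen input

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # sum rule: (u + v)' = u' + v'
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule: (u v)' = u' v + u v'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def sin(x):
    # chain rule: (sin u)' = cos(u) * u'
    if isinstance(x, Dual):
        return Dual(math.sin(x.val), math.cos(x.val) * x.der)
    return math.sin(x)

# Differentiate f(x) = x * sin(x) + x at x = 2.
x = Dual(2.0, 1.0)   # seed dx/dx = 1
f = x * sin(x) + x
print(f.val, f.der)  # f(2) and f'(2) = sin(2) + 2*cos(2) + 1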
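
On the exactness question in the thread above: the derivative that dual numbers produce is the analytic derivative evaluated in floating point, so it agrees with a hand-written derivative up to machine precision, while finite differences carry additional truncation error. This second sketch continues the one above (it reuses Dual, sin, and x); the helper names f and fprime are hypothetical.

def f(t):        # plain float version, for finite differences
    return t * math.sin(t) + t

def fprime(t):   # hand-derived reference: f'(t) = sin(t) + t*cos(t) + 1
    return math.sin(t) + t * math.cos(t) + 1.0

ad = (x * sin(x) + x).der                  # forward-mode AD, x = Dual(2.0, 1.0)
h = 1e-6
fd = (f(2.0 + h) - f(2.0 - h)) / (2 * h)   # central finite difference

print(abs(ad - fprime(2.0)))  # 0.0 or ~1e-16: exact up to machine precision
print(abs(fd - fprime(2.0)))  # ~1e-10: truncation plus rounding error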