MIT 6.S191 (2019): Biologically Inspired Neural Networks (IBM)

  • Published Jan 9, 2025

COMMENTS • 20

  • @StevenAkinyemi
    @StevenAkinyemi 4 years ago +12

    I have watched so much content with different English accents that this accent does not bother me. Which is how it should be. English has become a universal language; the American accent shouldn't be the default.
    Awesome content BTW!

  • @abitfrosty
    @abitfrosty 5 years ago +45

    That feeling when you're Russian and watching MIT courses in English with a Russian lecturer.

    • @igorpopov9384
      @igorpopov9384 5 years ago +10

      Dude, I am Russian, but it is hard for me to understand him as well :D I thought this kind of accent didn't exist at all, maybe only as some kind of joke... but, damn, here it is... :D And it is as hard to understand as an Indian accent :)) I can't even understand how it is possible to be a lecturer at MIT and have such an accent :D

    • @НатальяСоколова-я1г
      @НатальяСоколова-я1г 5 years ago +3

      @@igorpopov9384 Oh, so you're from England, dude! He is as Russian as you are; you could have written that in Russian.

  • @serge1444
    @serge1444 4 years ago +1

    Comment for 16:55: to train a single output layer, you don't need backpropagation. You can use non-iterative learning, as in Extreme Learning Machines.
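    A minimal sketch of the idea in this comment (not from the lecture itself): an Extreme Learning Machine keeps random, fixed hidden-layer weights and solves only the output layer in closed form by least squares, with no backpropagation. All names and the toy task are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy regression data: learn y = sin(x) on [-3, 3]
    X = np.linspace(-3, 3, 200).reshape(-1, 1)
    y = np.sin(X)

    # Hidden layer: random, fixed weights -- never trained
    n_hidden = 50
    W = rng.normal(size=(1, n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)  # (200, n_hidden) hidden activations

    # Output layer: solved in one shot by least squares -- non-iterative
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)

    mse = np.mean((H @ beta - y) ** 2)
    print(f"train MSE: {mse:.5f}")
    ```

    The only "learning" step is the single `lstsq` call; whether this counts as biologically plausible is exactly what the surrounding discussion is about.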

  • @Xraid32
    @Xraid32 5 years ago +14

    Biological Thinking Intensifies.

  • @volotat
    @volotat 5 years ago +6

    Thank you for publishing these lectures.

  • @CagnPolat
    @CagnPolat 4 years ago +2

    Thank you for the great lecture. Could you please recommend a book with more explanation of the differential equations and the physics behind them?

  • @winviki123
    @winviki123 5 years ago +6

    Please add subtitles.

  • @davidding3125
    @davidding3125 5 years ago +3

    Very inspiring lecture. Thanks for making these available!

  • @youtubeadventurer1881
    @youtubeadventurer1881 5 years ago +3

    I think it's worth pointing out that backpropagation isn't directly competing with learning algorithms in biological brains. Presumably those biological learning algorithms would be of little use without the innate knowledge accumulated through hundreds of millions of years of evolution. Backpropagation is actually largely competing with those evolutionary processes.

    • @user93237
      @user93237 5 years ago +2

      Though some amount of hill-climbing/evolution is likely happening in the brain, because the way abilities improve is very evolution-like (increments on prior experience, and one can only learn what one already almost knows). What is missing in AI is non-evolutionary one-shot learning, e.g. how you remember what you ate for breakfast.

  • @serge1444
    @serge1444 4 years ago +1

    Mister Chekov, start the transmission!

  • @mahdibejani7166
    @mahdibejani7166 3 years ago +1

    I suggest giving a lecture about spiking neural networks and how to train them. I think it would be suitable for the next lecture!

  • @ashishvishwakarma5362
    @ashishvishwakarma5362 5 years ago +2

    Thanks for sharing

  • @KadirErturk
    @KadirErturk 3 years ago

    Great approach to the problem. That is more realistic than BP. I will try to implement my MySQL brain model and see if it helps. Thanks for sharing.

  • @jhordyhowersanchez6852
    @jhordyhowersanchez6852 3 years ago

    Thank you very much

  • @Zvgrujhgkkkjhgfdfcxxjj
    @Zvgrujhgkkkjhgfdfcxxjj 4 years ago

    Crystal clear explanation!

  • @jep1912
    @jep1912 4 years ago

    Brainchip Inc.