Zan Huang | Adapting Neurological Structures to AI

  • Published 29 Sep 2024

COMMENTS • 11

  • @antonystringfellow5152
    @antonystringfellow5152 2 days ago +2

    Excellent!
    I look forward to seeing his paper and hope that those working on deep neural networks will take a good look at it too.
    I also look forward to seeing him back here, going through his paper.

  • @eismccc
    @eismccc 21 hours ago +1

    That was amazing, thank you so much. I am very much on board, with my own pathway toward neural AI that is closer to actual neurons, where very minimal information is used to understand even more information using very little energy, as opposed to current technology, which requires huge amounts of data, and huge amounts of energy, to understand very little. Well done, sir.

  • @richardg.lanzara3732
    @richardg.lanzara3732 1 day ago +1

    Very interesting talk! You mentioned the importance of Markov blankets and the concept that it is Markov blankets all the way down, so to speak. Perhaps we need better algorithms to describe these Markov blankets; I offer some in my book, "Origins of Life's Sensoria".

  • @nathanhelmburger
    @nathanhelmburger 2 days ago +1

    I love this! I've got some related talks to share.
    Platonic Representation Hypothesis: ua-cam.com/video/V7AyriUcXZQ/v-deo.html
    Physics of Language Models: ua-cam.com/video/kf_eGgVtOc/v-deo.html

  • @AGI-Corp
    @AGI-Corp 2 days ago +1

    Zan Huang offers a paradigm shift with a laser focus; it gives a clear lens that VCs should be using when picking investments.
    "There's a lot of money going to AI, but it's all going to one particular thing: LLM scaling. It's really unfortunate, because there's actually such good work being done, but it's still the minority. There's some where I am, at MIT, and there's some on the coast. For the most part, though, it's a lot of money and a lot of smart people who are not thinking about it. I do think, for example, that Ilya S. is a really brilliant person, but he also makes claims such as 'these papers are the only papers you need to read to know what's exciting right now,' and it's the Transformer paper and his own sequence-to-sequence model, but that isn't really doing anyone..."
    Zan Huang

  • @vedanirvana111
    @vedanirvana111 18 hours ago

    Beautiful Symmetry indeed! Looking forward to the paper.

  • @EDG-r7t
    @EDG-r7t 2 days ago

    Drawing a triangle fractal on paper and rolling it, like preparing it for a wormhole... and folds... so even if you chop it somewhere, you would be able to calculate it... the folding would change with time across different layers... and building learning models... maybe.

  • @MIIIM-7
    @MIIIM-7 19 hours ago

    36:28 is the one designated to work in earthly humans

  • @nathanhelmburger
    @nathanhelmburger 2 days ago

    Oh, and this one for the importance of percolation / phase-change-boundary in the context of a model of the brain: ua-cam.com/video/Zwm6EnDMInc/v-deo.html

  • @JTedam
    @JTedam 1 day ago +1

    Is it possible that learning begins at the cellular level during the embryonic stage? For example, a newborn horse instinctively knows how to stand and walk. This might be due to abilities encoded in its DNA, passed down from its parents like a pre-trained model. If true, it suggests that learning continues across generations, improving the brain's pre-trained model and enhancing the species' abilities over time.

    • @nathanhelmburger
      @nathanhelmburger 14 hours ago

      I'm pretty sure it's simpler than that. I think that, over evolutionary history, horses whose genes encoded worse priors in their fetal brains died more often. The horses we see today are the descendants of survivors who lucked into better genetic priors. Evolution is brutal; it kills the dumb and inflexible. Let's romanticize cultural transmission through child rearing, not the cruel methods of evolution.