Building Machines that Learn & Think Like People - Prof. Josh Tenenbaum ICML2018

  • Published 9 May 2024
  • Recorded July 13th, 2018 at the 2018 International Conference on Machine Learning
    Joshua Tenenbaum is Professor of Cognitive Science and Computation at the Massachusetts Institute of Technology. He is known for contributions to mathematical psychology and Bayesian cognitive science.
    icml.cc/Conferences/2018/Sche...
  • Science & Technology

COMMENTS • 23

  • @dl1129
    @dl1129 5 years ago +3

    Thank you for the upload.

  • @semidemiurge
    @semidemiurge 5 years ago +1

    Fascinating. A promising path forward. Thanks for the upload.

  • @deeplearningpartnership
    @deeplearningpartnership 5 years ago +9

    Great talk.

    • @dalesmith8403
      @dalesmith8403 5 years ago

      The AI world will mature in the coming Widget World that is pushed by our nature towards convenience.
      The social issues will slide into lame protests while whole systems of long-term off-world engineering and mining of resources out past Mars,
      in the broken planet parts, are being exploited for large, unbelievable projects on autopilot.
      Because AI is not old-world animal-based, its self-awareness will be in a niche-filling context unknown to our kind.
      History shows hints if one learns of the Gen Times in Lore, which gets inspired to be comparable to our desire to reason.
      We cycle up into the metamorphosis of the butterfly,
      and then fall by reasons as vast as the hairs on your head.
      If one looks long enough,
      the story tells itself, like the spider webs on a morning path.
      And the Children of the Whole Wide World are Watching!

  • @robinranabhat3125
    @robinranabhat3125 11 months ago

    At 10:56, Josh talks about the kid planning its way to build that tower. But what's the motivation in the first place for that child to attempt to build the tower?

  • @cirithduath7526
    @cirithduath7526 5 years ago +2

    My only complaint with this channel is that the videos are so quiet. I work in a loud environment and I have trouble making out the voices.

  • @machinistnick2859
    @machinistnick2859 3 years ago

    thanks

  • @bruhweexist
    @bruhweexist 5 years ago +2

    Hello, is there a place I can watch recordings of the NAMPI conference referenced here? Thanks. :)

  • @alexsmith2526
    @alexsmith2526 5 years ago +2

    Before you can step forward, look to the inbuilt programming, in the DNA, of the child's inherent knowledge. Figure that out and move forward. Not all children learn at the same rate; it must be something built into the inherent DNA sequence of the individual.

    • @iriya3227
      @iriya3227 5 years ago

      Interesting point; however, I always thought normal children learn at roughly the same rate. Children with genetic problems, on the other hand, learn at a slower rate. It's still interesting why some children learn some things faster than others. Not sure if it's environment or genetics.

  • @darrendwyer9973
    @darrendwyer9973 5 years ago +2

    The brain works in cycles, so many frames per second, taken from the visual input pathways... the 6-month-old has how many frames of learning? Over 6 million iterations. An 18-month-old has over 18 million brain iterations, and so on and so forth... What seems to happen is that at any given point in time, the most important ideas form, and less important ideas decline. Consciousness seems to derive from all these millions of iterations of the visual world and the impact of reality. Imagine life as a continuous movie, not a slide show.

  • @jmw1500
    @jmw1500 2 years ago +1

    At 53:54: this is pretty much a lie. Both systems try to solve physical state spaces.
    Neural networks (or differentiable programming) try this with simplifying assumptions about smoothness, sure. But it is one of the basic ideas from mathematical analysis that sine waves and polynomials can approximate pretty much any other function. It is part of the main body of theorems that have been known for decades about neural networks that they are fairly universal approximators. This includes approximating functions that are not smooth, or anything that would count as a program.
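
As a rough illustration of the universal-approximation point this comment invokes (not something from the talk or the comment itself): the NumPy sketch below fits a one-hidden-layer tanh network to a discontinuous step function with plain gradient descent. All sizes and hyperparameters are arbitrary choices for the demo.

```python
# Illustrative sketch only: a tiny one-hidden-layer network trained with
# plain gradient descent to fit a non-smooth step function, to make the
# "universal approximator" claim concrete. Hyperparameters are arbitrary.
import numpy as np

rng = np.random.default_rng(0)

# Target: a discontinuous step function on [-1, 1].
x = np.linspace(-1.0, 1.0, 200).reshape(-1, 1)
y = (x > 0.0).astype(float)

# One hidden layer of tanh units, linear output.
H = 32
W1 = rng.normal(scale=2.0, size=(1, H))
b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(H, 1))
b2 = np.zeros(1)

lr = 0.1
for step in range(20_000):
    # Forward pass.
    h = np.tanh(x @ W1 + b1)       # (200, H)
    pred = h @ W2 + b2             # (200, 1)

    # Backward pass for mean squared error.
    n = len(x)
    dpred = 2.0 * (pred - y) / n   # (200, 1)
    dW2 = h.T @ dpred
    db2 = dpred.sum(axis=0)
    dh = dpred @ W2.T              # (200, H)
    dz = dh * (1.0 - h ** 2)       # derivative of tanh
    dW1 = x.T @ dz
    db1 = dz.sum(axis=0)

    W1 -= lr * dW1
    b1 -= lr * db1
    W2 -= lr * dW2
    b2 -= lr * db2

mse = float(np.mean(((np.tanh(x @ W1 + b1) @ W2 + b2) - y) ** 2))
print(f"final mean squared error against the step target: {mse:.4f}")
```

With more hidden units and more training, the fit keeps tightening everywhere except in a shrinking neighborhood of the jump, which is what the classical approximation theorems guarantee for non-smooth targets like this one.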

  • @jwhite8086
    @jwhite8086 5 years ago +1

    When you do DMT, if you watch the world with eyes open, meaning deconstructs, everything turns into a Lego world on a grid, you see lines of predicted motion, and you notice the mind tracking patterns and moving between levels of resolution and perspective.

    • @jwhite8086
      @jwhite8086 5 years ago +1

      Kintsugi G, maybe the AI is on the right track then.

  • @ahmaddh6408
    @ahmaddh6408 4 years ago

    I thought Jeff Ross would teach us about A.I.

  • @JinanKB
    @JinanKB 2 years ago

    Isn't it strange that we never bothered to ask how children learn before setting up the present educational paradigm, which is based on 'how to teach children' and not on 'how children learn'? The paradox is that even now we are interested in knowing how children learn in order to make AI, and not to reimagine education.

    • @WilliamThomas2040
      @WilliamThomas2040 1 year ago +1

      We created schools to train people to work in factories, not to learn for its own sake.

  • @KnThSelf2ThSelfBTrue
    @KnThSelf2ThSelfBTrue 5 years ago

    I think this is fascinating, but I also think that preparing for the systemic cultural biases that will emerge from completely imitating the learning behavior of infants is an important ethical consideration.

  • @sgrimm7346
    @sgrimm7346 1 year ago

    True AI, the real thing, won't be realized via a computer... it will be electronic, just not a computer.

  • @dcngn_
    @dcngn_ 4 years ago +1

    Can the Grudge please stop trying to die somewhere near a microphone? I'm trying to listen to a talk.