From “Artificial” to “Real” Intelligence - Major AI breakthroughs in 5 Minutes (1957-2022)

  • Published May 1, 2022
  • From the days when no one believed in artificial neural networks (ANNs), to the present when they are ubiquitous, to a plausible future in which they could surpass human intelligence - here is a 5-minute summary of the defining moments in AI research from 1957 to 2022.
    VIDEO CREDITS - the original footage is taken from “Kung Fu Panda” (2008).
    --------------------------------------------------
    Storyline & Noteworthy Events
    --------------------------------------------------
    00:00:21 : [The First Artificial Neuron Model]
    The perceptron was the first successful attempt at replicating a biological neuron and its connections in a machine. The project generated a lot of excitement.
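    The perceptron's learning rule is simple enough to sketch in a few lines of Python (an illustrative reconstruction, not Rosenblatt's original implementation; the function names here are made up for the example):

    ```python
    # Minimal perceptron sketch: weights are nudged toward misclassified
    # examples until the (linearly separable) data is correctly split.

    def perceptron_train(samples, labels, epochs=20, lr=0.1):
        """samples: list of feature tuples; labels: +1 / -1."""
        w = [0.0] * len(samples[0])
        b = 0.0
        for _ in range(epochs):
            for x, y in zip(samples, labels):
                activation = sum(wi * xi for wi, xi in zip(w, x)) + b
                pred = 1 if activation >= 0 else -1
                if pred != y:  # update only on mistakes
                    w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                    b += lr * y
        return w, b

    def perceptron_predict(w, b, x):
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1

    # Learn the AND function, which IS linearly separable
    X = [(0, 0), (0, 1), (1, 0), (1, 1)]
    y = [-1, -1, -1, 1]
    w, b = perceptron_train(X, y)
    print([perceptron_predict(w, b, x) for x in X])  # → [-1, -1, -1, 1]
    ```

    A single layer like this can only draw one straight decision boundary - which is exactly the limitation the next section's criticism hinged on.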
    00:00:34 : [AI Winter]
    In 1969, Marvin Minsky and Seymour Papert published the book “Perceptrons”, which pointed out fundamental limitations of perceptrons (most famously, their inability to learn XOR). This led to a period often termed the first “AI Winter”. During this time, funding for artificial neural networks (ANNs) was pulled, and mainstream ML venues such as ICML and IEEE conferences routinely rejected papers bearing the words “neural networks” in the title. Conventional methods such as SVMs and decision trees took center stage.
    00:01:00 : [Birth of the Multi-Layer Perceptron]
    Things began to change when the multi-layer perceptron was introduced. Thanks to non-linear activation functions, learning across multiple layers became possible, answering several of the criticisms Minsky had raised. Back-propagation furthered the progress by letting an ANN fine-tune its weights, treating learning as a credit assignment problem.
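    As a rough illustration of these two ideas together, here is a toy two-layer network with a non-linear activation, trained by back-propagation on XOR - the very function a single-layer perceptron cannot learn. (A minimal NumPy sketch, not from the video; the layer sizes, seed, and learning rate are arbitrary choices.)

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # XOR: not linearly separable, hence impossible for a single perceptron
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([[0], [1], [1], [0]], dtype=float)

    W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # hidden layer
    W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # output layer
    sigmoid = lambda z: 1 / (1 + np.exp(-z))
    lr = 0.5

    losses = []
    for _ in range(5000):
        # forward pass: non-linear activation (tanh) in the hidden layer
        h = np.tanh(X @ W1 + b1)
        out = sigmoid(h @ W2 + b2)
        losses.append(float(np.mean((out - y) ** 2)))

        # backward pass: the chain rule assigns credit to every weight
        d_out = (out - y) * out * (1 - out)   # error signal at the output
        d_h = (d_out @ W2.T) * (1 - h ** 2)   # propagated back through tanh
        W2 -= lr * h.T @ d_out;  b2 -= lr * d_out.sum(0)
        W1 -= lr * X.T @ d_h;    b1 -= lr * d_h.sum(0)

    print(np.round(out.ravel()))  # should approach [0, 1, 1, 0]
    ```

    The point of the sketch: without the hidden layer's non-linearity, no amount of training could separate XOR; with it, back-propagation finds suitable weights automatically.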
    00:01:26 - [CNNs and the Advent of Modern Computer Vision]
    LeNet was a breakthrough in computer vision. Combining back-propagation with convolution and pooling layers, it identified handwritten US zip codes with an error rate of about 1%. It was still severely constrained by limited data and computational resources.
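    The convolution and pooling operations at the heart of LeNet-style CNNs can be sketched directly in NumPy (an illustrative version, not the LeNet implementation; the image and kernel values here are made up for demonstration):

    ```python
    import numpy as np

    def conv2d(image, kernel):
        """Valid 2-D convolution: slide the kernel over the image."""
        kh, kw = kernel.shape
        h, w = image.shape
        out = np.zeros((h - kh + 1, w - kw + 1))
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
        return out

    def max_pool(fmap, size=2):
        """Non-overlapping max pooling: downsample the feature map."""
        h, w = fmap.shape[0] // size, fmap.shape[1] // size
        return fmap[:h * size, :w * size].reshape(h, size, w, size).max(axis=(1, 3))

    image = np.arange(36, dtype=float).reshape(6, 6)
    edge_kernel = np.array([[1.0, -1.0], [1.0, -1.0]])  # crude vertical-edge detector
    fmap = conv2d(image, edge_kernel)   # 6x6 image, 2x2 kernel → 5x5 feature map
    pooled = max_pool(fmap)             # 5x5 → 2x2 after 2x2 max pooling
    print(fmap.shape, pooled.shape)     # → (5, 5) (2, 2)
    ```

    Sharing one small kernel across the whole image is what made such networks feasible on the limited hardware of the time: far fewer weights than a fully connected layer over raw pixels.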
    00:02:37 - [AlexNet and the Monumental ILSVRC 2012 Contest]
    While conventional models performed quite well, the true prowess of ANNs could only be demonstrated with huge amounts of real-world data. The ImageNet dataset (14 million labeled images) gave ANNs exactly that opportunity. In the ILSVRC 2012 contest, AlexNet (a deep neural network) crushed all its competitors with a top-5 accuracy of roughly 85%. This marked the beginning of the deep learning revolution.
    00:04:28 - [Singularity]
    “Singularity” is the term often used for the hypothetical point in time when AI surpasses human intelligence.
    ---------------------------------
    Papers / References
    ---------------------------------
    00:00:21 : [Perceptron]
    (Rosenblatt, 1957) - “The Perceptron: A Perceiving and Recognizing Automaton”
    00:00:34 : [Criticism of Perceptron]
    (Minsky & Papert, 1969) - “Perceptrons: An Introduction to Computational Geometry”
    00:01:00 : [Non-Linear Activation Function Introduced]
    (Ivakhnenko, 1968) - “The group method of data handling - a rival of the method of stochastic approximation”
    00:01:19 - [Convolution & Pooling Layers Introduced]
    (Fukushima, 1980) - “Neocognitron: A Self-organizing Neural Network Model for a Mechanism of Pattern Recognition Unaffected by Shift in Position”
    00:01:26 - [Application of Back Propagation to ANNs]
    (Hinton, 1986) - “Learning representations by back-propagating errors”
    00:01:34 - [LeNet]
    (LeCun, 1998) - “Gradient-Based Learning Applied to Document Recognition”
    00:02:24 - [Dropout]
    (Hinton, 2012) - “Improving neural networks by preventing co-adaptation of feature detectors”
    00:02:25 - [Residual Connections]
    (He, 2016) - “Deep residual learning for image recognition”
    00:02:35 - [ReLU]
    (Nair, 2010) - “Rectified Linear Units Improve Restricted Boltzmann Machines”
    00:02:37 - [ImageNet]
    (Deng, 2009) - “ImageNet: A Large-Scale Hierarchical Image Database”
    00:02:48 - [AlexNet]
    (Krizhevsky, 2012) - “ImageNet Classification with Deep Convolutional Neural Networks”
    00:03:00 - [GANs]
    (Goodfellow, 2014) - “Generative Adversarial Nets”
    00:03:06 - [AlphaGo]
    (Silver, 2016) - “Mastering the Game of Go with Deep Neural Networks and Tree Search”
    00:03:14 - [RNN]
    (Hopfield, 1982) - “Neural Networks and Physical Systems with Emergent Collective Computational Abilities”
    00:03:15 - [LSTM]
    (Hochreiter, 1997) - “Long Short-Term Memory”
    00:03:18 - [Transformers]
    (Vaswani, 2017) - “Attention Is All You Need”
    00:03:20 - [BERT]
    (Devlin, 2018) - “BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding”
    00:03:21 - [Jukebox]
    (Dhariwal, 2020) - “Jukebox: A Generative Model for Music”
    00:03:22 - [GPT-3]
    (Brown, 2020) - “Language Models Are Few-Shot Learners”
    00:03:23 - [LaMDA]
    (Thoppilan, 2022) - “LaMDA: Language Models for Dialog Applications”
  • Science & Technology

COMMENTS • 23

  • @medexamtoolsdotcom • 2 years ago +5

    "Back propagation". Was that an intentional pun, since that was the moment he sent the pieces of the trap flying by using the muscles in his back?

  • @omkiranmalepati1645 • 1 year ago +5

    I can't appreciate this more 😭 Superbly compiled...

  • @danberm1755 • 5 months ago

    This is fantastic! What a well thought out analogy. Thanks for sharing it!
    Got goosebumps at a couple of points such as "back propagation".

    • @danberm1755 • 5 months ago

      In the "future improvements" section you could add: prompt engineering, series of experts training, mamba, Q Star, asic transformer hardware.
      I'd remove quantum computing. Not going to be a major player in the singularity.

    • @HeduAI • 5 months ago +1

      @@danberm1755 Thanks! This video is over one year old. Prompt engineering wasn't really a thing back then. I think I should upload an updated version of this :D

  • @neanderthalancestor4955 • 1 year ago

    Oh man!! This is epic! I rarely comment on YouTube, but this was honestly so good that I had to give it to you. Rarely does watching a video make my day! I love the style of combining movie references and AI, my two biggest loves in life. I actually wanted to learn about transformers but loved your style so much that I ended up watching every single video on your channel.
    Keep up the amazing work!!

  • @jackhanke343 • 2 years ago

    Great video!

  • @takudzwamakusha5941 • 2 years ago

    Good to see new content! More explainer videos please. This time vision transformers maybe?

    • @HeduAI • 2 years ago +2

      Yep! Working on another video. Will come back to architectures after that :)

  • @JasminShah • 1 year ago

    This is so well made! Perhaps it's time to make some changes and re-upload with the recent developments in the field? 😄

  • @jubayerislam4944 • 1 year ago

    where have u gone? please keep uploading

  • @pratikpratik8495 • 2 years ago

    nice one :)

  • @pingshengli7745 • 1 year ago +3

    Po == No Free Lunch Theorem (Wolpert, et al., 1997) ;) An eternal limitation for all intelligence systems.

    • @HeduAI • 1 year ago +1

      Haha! Good one ;D

  • @vikramnimma • 2 years ago

    Excellent..

  • @Ruasack • 2 years ago

    I understood like 5% of those terms but still loved the video

    • @HeduAI • 2 years ago

      5% is a good start! :) Keep pushing forward ...

  • @VijayBhaskarSingh • 1 year ago

    haha.. this is definitely subconscious mind!! :D

  • @phieyl7105 • 2 years ago

    Oh lord he commin

  • @raunakdey3004 • 1 year ago

    This vid is so cute !

  • @phieyl7105 • 1 year ago

    ChatGPT

  • @peanutgallery4 • 2 years ago

    This droidie propaganda won't work
    Good video though