MIT 6.S191: Convolutional Neural Networks

  • Published 25 Dec 2024

COMMENTS • 50

  • @johnpuopolo4413
    @johnpuopolo4413 2 months ago +7

    Great series! Thanks for making the concepts approachable. These lectures are at a perfect level for understanding key concepts and for having the vocabulary and foundation for understanding other available materials. I especially found Ava's overview of Transformers and how the Q, K, and V matrices relate to be an "a-ha" moment! Thank you, all.

  • @husseinekeita8909
    @husseinekeita8909 7 months ago +16

    Thank you for sharing quality content like this for free over several years.

  • @bytegraftkids
    @bytegraftkids 6 months ago +5

    I don't even need to be at MIT to learn from them! Outstanding and clear delivery of difficult concepts. Thank you.

  • @mahmoudjafari-tk6ry
    @mahmoudjafari-tk6ry 4 months ago +3

    Dear Amini, the lecture was good, especially the navigation part.

  • @PerceptronsAI
    @PerceptronsAI 6 months ago +1

    I wanted to extend my sincere thanks for the wonderful lecture you delivered on Deep Learning.

  • @aiwroy
    @aiwroy 7 months ago +1

    While sliding windows are good, YOLO outperforms Faster R-CNN and is generally considered state of the art for object detection.

  • @vijaykumars1771
    @vijaykumars1771 6 months ago +2

    Thank you, I have one doubt here: at 15:30 you said 10k neurons in the hidden layer for processing 10k inputs, so the result would be 10k² parameters. My doubt is: why do we need 10k neurons at any layer? We can decide the number of neurons and layers ourselves, right?

    • @primedanny417
      @primedanny417 4 months ago

      It's just an example, choosing # of neurons and # of layers is an engineering task. Models tend to be able to solve complex tasks better the deeper (or wider) they are, and an example with a 100 x 100 image with 1 fully-connected hidden layer of 10,000 neurons would have >100M connections/weights.
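
The arithmetic in the reply above is easy to check; here is a minimal sketch using just the thread's example sizes (a 100 × 100 image and a 10,000-neuron hidden layer):

```python
# Parameter count for one fully-connected layer on a 100 x 100 grayscale image.
inputs = 100 * 100            # 10,000 pixel values after flattening
neurons = 10_000              # hidden-layer width from the example
weights = inputs * neurons    # one weight per (input, neuron) pair
biases = neurons              # one bias per neuron
params = weights + biases
print(params)                 # 100010000 -> over 100M parameters, as the reply says
```

This is exactly why the lecture motivates convolutions: a conv layer's parameter count depends only on kernel size and channel counts, not on image resolution.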

    • @xxyyzz8464
      @xxyyzz8464 3 months ago

      @primedanny417 True, but there are plenty of examples of fully connected networks that work and train well on 128x128 grayscale images, for example. I know they aren't HD quality or SoTA by any means, but to say FC nets are "completely impractical" as a blanket statement is a little strong IMO. Great lecture series; this is nit-picking here. We might as well criticize using the term "convolutional" without explaining that it's typically implemented as a cross-correlation and not a convolution while we're at it! 😆
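
The convolution-vs-cross-correlation nit above is easy to demonstrate. A minimal NumPy sketch (function names are mine): deep-learning "convolution" layers slide the kernel as-is (cross-correlation), whereas true convolution flips the kernel 180° first.

```python
import numpy as np

def cross_correlate2d(img, kernel):
    """What deep-learning 'convolution' layers actually compute (valid padding)."""
    kh, kw = kernel.shape
    oh, ow = img.shape[0] - kh + 1, img.shape[1] - kw + 1
    out = np.empty((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def convolve2d(img, kernel):
    """True convolution: flip the kernel in both axes, then slide."""
    return cross_correlate2d(img, kernel[::-1, ::-1])

img = np.arange(16.0).reshape(4, 4)
asym = np.array([[1.0, 0.0],
                 [0.0, -1.0]])  # asymmetric kernel: the two ops disagree
print(np.allclose(cross_correlate2d(img, asym), convolve2d(img, asym)))  # False
```

Since the network learns its kernels anyway, a learned cross-correlation kernel is just the 180°-flipped version of the equivalent convolution kernel, so the distinction has no practical effect on training.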

  • @wuyanfeng42
    @wuyanfeng42 1 month ago

    OMG, it's so intuitive! 🤩

  • @woodworkingaspirations1720
    @woodworkingaspirations1720 7 months ago +2

    Waiting patiently

    • @o__bean__o
      @o__bean__o 7 months ago +2

      That's the spirit

  • @DreamBuilders-rq6km
    @DreamBuilders-rq6km 7 months ago +1

    Thanks for sharing this knowledge. Be blessed

  • @fideslegoale9611
    @fideslegoale9611 4 months ago

    Thank you for the courses, we are learning a lot.

  • @karterel4562
    @karterel4562 7 months ago

    Thanks for sharing this course, it's so useful!

  • @jteichma
    @jteichma 5 months ago

    Great courses, thanks! ❤

  • @leesiheon8013
    @leesiheon8013 5 months ago

    Love the lecture!

  • @htoorutube
    @htoorutube 7 months ago +5

    Software Lab 1 still hasn't been made available; when will that happen?

  • @zhspartan9993
    @zhspartan9993 4 months ago

    Thanks for the lecture

  • @sudhirkothari
    @sudhirkothari 4 months ago

    Fantastic! Thank you for the lectures.

  • @ajaywanekar9136
    @ajaywanekar9136 5 months ago

    Very nice explanation.

  • @ghaithal-refai4550
    @ghaithal-refai4550 7 months ago

    Thank you very much, it is a great lecture. I hope that you develop the lectures over the years, as the content seems to stay the same. Topics like pretrained models, knowledge transfer, and YOLO might be good additions to the CNN lecture.

  • @noushadarakkal5179
    @noushadarakkal5179 2 months ago

    Thanks for this great lecture series. However, the audio is muffled at some points.

  • @genkideska4486
    @genkideska4486 7 months ago +2

    Waiting ..

  • @shahriarahmadfahim6457
    @shahriarahmadfahim6457 7 months ago +10

    But the lab between Lectures 2 and 3 is still not published on the website?

    • @benjaminy.
      @benjaminy. 7 months ago +5

      I think it is not their practice to publish their lab work.

    • @RajeevKumar-dq4ct
      @RajeevKumar-dq4ct 7 months ago +4

      It has been published now.

  • @ajayrathore7045
    @ajayrathore7045 5 months ago +2

    The lecture is awesome, but the audio quality is very poor.

  • @albertmills9365
    @albertmills9365 2 months ago

    It's weird that he uses Boston Dynamics robots in his first slides, since Boston Dynamics has gone on record saying they don't use AI.

  • @suhaimiseliman8593
    @suhaimiseliman8593 6 months ago

    Each color has its own frequency range. An active CMOS sensor converts photons into an electron beam. If 3 LEDs can produce multicolor, I think 🤔 I can use R, G & B bandpass filters to get the same result via a special-purpose digital oscilloscope. 😎😉

  • @meshkatuddinahammed
    @meshkatuddinahammed 6 months ago

    I have a confusion about Lab 2 Part 2 (Facial Detection with CNN). It is claimed that in the CelebA dataset most faces are light-skinned females, yet the model ultimately gives lower accuracy for this category of faces compared to the other three categories. Why is that?

    • @zahramanafi4793
      @zahramanafi4793 5 months ago +1

      Where did you find the labs? Are they available on YouTube?

  • @samiragh63
    @samiragh63 7 months ago +1

    Can't wait...

  • @marlhex6280
    @marlhex6280 6 months ago

    Hello Alex, please enlighten the peasants with a juicy time-series episode? If you had been my teacher since I was a kid, I would be a different person today. Thank you for this; grateful today and in the future.

    • @IvanAnishchuk
      @IvanAnishchuk 5 months ago

      A time-series intro lecture would indeed be great to watch!

  • @darylltempesta
    @darylltempesta 4 months ago

    I love you, but the Keller Paradox points to overlooked emergence.

  • @jsherborne92
    @jsherborne92 5 months ago +2

    Great content, but the audio sounds like it was recorded with a toaster.

  • @jorgeguiragossian8488
    @jorgeguiragossian8488 7 months ago +1

    Have any of the labs been published yet?

  • @patrickmultimedia
    @patrickmultimedia 11 days ago

    holy smokes!

  • @tmcgraw
    @tmcgraw 7 months ago

    right?

  • @Mantra-x1d
    @Mantra-x1d 4 months ago

    Testing

  • @abdelazizeabdullahelsouday8118
    @abdelazizeabdullahelsouday8118 7 months ago

    Thank you for sharing. Please, I need help: I sent you an email but got no response. Could you please help me?
    Thanks in advance.

  • @patrickmultimedia
    @patrickmultimedia 11 days ago

    Something's gotta be done about the mic for the questions; the sound is absolutely horrible!!!
