BERT Neural Network - EXPLAINED!

  • Published 28 Nov 2024

COMMENTS • 332

  • @CodeEmporium
    @CodeEmporium  1 year ago +10

    For details and code on building a translator using a transformer neural network, check out my playlist "Transformers from scratch": ua-cam.com/video/QCJQG4DuHT0/v-deo.html

  • @brilliantdirectoriestraining
    @brilliantdirectoriestraining 3 years ago +146

    I have studied AI, NLP, and neural networks for several years, but the way you explained this was lovely, friendly, and very simple, which is why I am pretty sure you are BERT.

  • @AmandeepSingh-xk4yv
    @AmandeepSingh-xk4yv 4 years ago +249

    Just watched a video on Transformers, and now this. I'm astounded at how you explained such complex notions with such ease!
    Hugely underrated channel!

    • @CodeEmporium
      @CodeEmporium  4 years ago +7

      Thanks a lot! Glad you liked it 😊

  • @prasannabiswas2727
    @prasannabiswas2727 4 years ago +1

    I read many blogs on BERT, but they were more focused on how to use BERT rather than what BERT actually is. This video helped me clear all my doubts regarding how BERT is trained. Clear and concise explanation.

  • @xJFGames
    @xJFGames 2 years ago +6

    I'm honestly amazed at how you managed to transform a complex algorithm into a simple 10-minute video. Many thanks to you; my final thesis appreciates you.

  • @krishnaik06
    @krishnaik06 4 years ago +211

    Amazing Explanation :)

    • @doyourealise
      @doyourealise 4 years ago +1

      big fan sir

    • @richarda1630
      @richarda1630 3 years ago

      hear hear! so agree!

    • @User-nq9ee
      @User-nq9ee 2 years ago +2

      To teach us, you study and explore... really grateful for your efforts, Krish.

    • @indgaming5452
      @indgaming5452 2 years ago

      Wherever Krish sir is, I will come there... Sir, I found you here also...
      We are learning together 😇

    • @itsme1674
      @itsme1674 1 year ago

      Nice to meet you sir

  • @healthertsy1863
    @healthertsy1863 1 year ago

    Don't hesitate, this is the best BERT explanation video for sure!

  • @NadavBenedek
    @NadavBenedek 2 years ago

    I love the "pass1" "pass2" concept of how you explain things. It's great.

  • @usamahussain4461
    @usamahussain4461 2 years ago +4

    Phenomenal the way you condense such a complicated concept into a few minutes, clearly explained.

  • @jeenakk7827
    @jeenakk7827 4 years ago +24

    I wish I had come across this channel earlier. You have a wonderful skill in explaining complicated concepts. I love your "3 pass" approach!!

  • @madhubagroy
    @madhubagroy 3 years ago +1

    The BEST explanation on BERT. Simply outstanding!

  • @ziangtian
    @ziangtian 1 year ago +1

    OMG!!!! This vid is a life-saver! It just elucidated so many aspects of NLP to me (a 3-month beginner who still understands nothing).

  • @rohanmirchandani9726
    @rohanmirchandani9726 3 years ago

    This is one of the best resources explaining BERT available online.

  • @mauriciolandos4712
    @mauriciolandos4712 1 year ago

    Best explainer on YouTube; you have a good mix of simplifying so it can be understood, but not over-simplifying, so we still learn deeply enough. Having 3 passes that go deeper each time was a great idea as well.

  • @michaelherrera4450
    @michaelherrera4450 3 years ago +5

    Dude, this video is incredible. I cannot express how good you are at explaining.

    • @CodeEmporium
      @CodeEmporium  3 years ago

      Thanks for watching! Super glad it is useful

    • @path1024
      @path1024 1 year ago

      Good try though!

  • @BogdanSalyp
    @BogdanSalyp 4 years ago +1

    Extremely underrated channel; didn't find any other good explanation on YouTube/Medium/Google.

  • @pallavijog912
    @pallavijog912 4 years ago

    Wow.. just switched from another BERT explained video to this.. stark difference.. excellent explanation indeed.. thanks..

  • @erfanshayegani3693
    @erfanshayegani3693 1 year ago

    Every time I watch this video I gain a better understanding of the procedure. Thanks a lot for the great content!!!

  • @AbdulWahab-cy9ib
    @AbdulWahab-cy9ib 4 years ago +1

    Was struggling to understand the basics of BERT after going through the Transformer model. This video was indeed helpful.

  • @rajeshve7211
    @rajeshve7211 6 months ago

    Best ever explanation of BERT! Finally understood how it works :)

  • @sumontasaha5638
    @sumontasaha5638 2 days ago

    God-level explanation. Thanks man!

  • @ydas9125
    @ydas9125 1 year ago

    I am very impressed by the clarity and core focus of your explanations to describe such complex processes. Thank you.

    • @CodeEmporium
      @CodeEmporium  1 year ago

      You are very welcome. Thanks for watching and commenting :)

  • @mays7n
    @mays7n 1 year ago

    This must be one of the best explanation videos on the internet, thank you!

  • @bidyapatip
    @bidyapatip 4 years ago

    After reading lots of blogs and videos, I thought it was such a difficult network. But after going through this, I find BERT (and its variants available in the transformers library) so easy to understand.

  • @akhileshpandey123
    @akhileshpandey123 1 year ago

    I always come back to your explanation whenever I want to refresh BERT concepts. Thanks for the effort.

  • @aeirya1686
    @aeirya1686 1 year ago

    Very very friendly, clear and masterful explanation. This is exactly what I was after. Thank you!

  • @goelnikhils
    @goelnikhils 1 year ago

    Excellent explanation. The main thing to note is the finer point around the loss functions that BERT uses, as not many other videos on the same topic cover this. Too good!

  • @utkarshujwal3286
    @utkarshujwal3286 1 year ago

    The NSP task does not directly involve bidirectional modeling in the same way as masked language modeling (MLM) does. Instead, it serves as a supplementary objective during BERT's pre-training phase. The purpose of the NSP task is to help the model understand relationships between sentences and capture broader context beyond adjacent words.
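
    To make the two objectives concrete, here is a minimal sketch of both pre-training heads using the Hugging Face transformers library (the checkpoint and example pair are illustrative, not from the video):

        import torch
        from transformers import BertTokenizer, BertForPreTraining

        tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
        model = BertForPreTraining.from_pretrained("bert-base-uncased")

        # A sentence pair is packed as [CLS] A [SEP] B [SEP]
        inputs = tokenizer("The man went to the store.",
                           "He bought a gallon of milk.",
                           return_tensors="pt")
        outputs = model(**inputs)

        # MLM head: one distribution over the vocabulary per token position
        print(outputs.prediction_logits.shape)        # [1, seq_len, vocab_size]
        # NSP head: two logits (IsNext vs. NotNext) computed from the [CLS] vector
        print(outputs.seq_relationship_logits.shape)  # [1, 2]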

  • @shrawansahu9500
    @shrawansahu9500 2 years ago

    Wow, the best explanation of BERT on YouTube, and free at that. Thanks man, you made NLP easy. 🍺

  • @JillRhoads
    @JillRhoads 1 year ago

    I'm a teacher with a comp-sci master's and am just diving into AI. This was absolutely great! I studied natural language in college about 20 years ago and this video really helped form a mental bridge to the new technologies. Language chatbots have been around forever! And the ones I studied used Markov chains trained on small corpora. So of course students would dump porn novels, the Bible and whatnot into them and then start them talking. We never laughed so hard!

  • @JP-gk6pi
    @JP-gk6pi 4 years ago

    The 3-pass explanation is a really good approach to explaining this complex concept. Best video on BERT.

    • @beteaberra
      @beteaberra 3 years ago

      Great video! But what is pass 1, pass 2 and pass 3?

  • @josephselwan1652
    @josephselwan1652 3 years ago +1

    Amazing stuff. For visualization purposes, when you get into a deeper pass, I would recommend always adding the zooming effect for intuitive understanding. I am not sure about others, but when you do that, I instantly know "OK, now we are within this 'box' "

    • @CodeEmporium
      @CodeEmporium  3 years ago +2

      Good thought. I'll try to make this apparent in the future. Thanks!

  • @andrewlai3358
    @andrewlai3358 4 years ago +3

    Thank you for the explanation. You really have a knack for explaining NLP concepts clearly without losing much fidelity. Please keep posting!

  • @ankit_khanna
    @ankit_khanna 3 years ago

    One of the best videos on BERT.
    Great work!
    Wishing you loads of success!

  • @pinakjani4281
    @pinakjani4281 4 years ago +1

    No one explains DL models better than this guy.

  • @ParthivShah
    @ParthivShah 3 months ago

    Thank You. Love from India.

  • @maverick3069
    @maverick3069 4 years ago +1

    The multiple passes of explanation is an absolutely brilliant way to explain! Thanks man.

  • @BiranchiNarayanNayak
    @BiranchiNarayanNayak 4 years ago

    Best explanation I have seen so far on BERT.

  • @deepeshkumar4945
    @deepeshkumar4945 1 year ago

    Dude, you are amazing. You explained the state-of-the-art NLP model in such a clear and concise video. Thanks a ton for this video!!!!!!

    • @CodeEmporium
      @CodeEmporium  1 year ago +1

      You are super welcome. Thanks so much for commenting this!

  • @AIMLDLNLP-TECH
    @AIMLDLNLP-TECH 1 year ago

    sentence: "The cat sat on the mat."
    BERT reads sentences bidirectionally, which means it looks at the words both before and after each word to understand its meaning in the context of the whole sentence.
    For example, let's say you want to know what the word "mat" means in the sentence: "The cat sat on the mat." BERT understands "mat" not just by itself, but by considering the words around it. It knows that a cat can sit on something, and "mat" here means a flat piece of fabric on the floor.
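
    This bidirectional use of context is easy to probe with masked-word prediction; a minimal sketch using the Hugging Face fill-mask pipeline (the example and checkpoint are illustrative, not from the video):

        from transformers import pipeline

        # BERT fills in [MASK] using the words on BOTH sides of it
        fill = pipeline("fill-mask", model="bert-base-uncased")
        for pred in fill("The cat sat on the [MASK]."):
            print(pred["token_str"], round(pred["score"], 3))
        # Plausible completions ("mat", "floor", "bed", ...) are ranked using
        # both the left context ("sat on the") and the final period.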

  • @somerset006
    @somerset006 1 year ago

    Nice job, man! Especially the multi-phase approach of explaining things, top to bottom.

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Super happy you liked the approach. Thanks for commenting

  • @rahuldey6369
    @rahuldey6369 3 years ago

    I'd read 4 articles before coming here and couldn't connect the dots. This single video showed me the way... Thanks a lottt

  • @magelauditore333
    @magelauditore333 4 years ago

    Such an underrated channel. Keep it up man

  • @andymoses95
    @andymoses95 4 years ago +1

    The one I had been looking for for the past 6 months! Thanks a lot for making this.

  • @alidi5616
    @alidi5616 3 years ago

    That's it. The best explanation I've come across. Receive my upvote and subscription 😁

    • @CodeEmporium
      @CodeEmporium  3 years ago

      Many thanks. Join the discord too :3

  • @aruncjohn
    @aruncjohn 4 years ago

    Excellent explanation! Will never miss a video of yours from now on!

  • @ilyasaroui7745
    @ilyasaroui7745 4 years ago

    Good touch to put the references in the description instead of on the slides.

  • @dannysimon2965
    @dannysimon2965 4 years ago

    Wow, thanks!! I tried watching many videos and couldn't understand a single thing. But yours was truly concise and informative.

  • @gauravchatterjee794
    @gauravchatterjee794 3 years ago

    Best Explanation by far!

  • @TSP-NPC
    @TSP-NPC 1 year ago

    Thanks for the great explanation of Transformers and the architecture of BERT.

    • @CodeEmporium
      @CodeEmporium  1 year ago

      My pleasure and thank you for the super thanks :)

  • @adityaj9984
    @adityaj9984 1 year ago

    BEST EXPLANATION EVEERRRR

  • @jan-rw2qx
    @jan-rw2qx 1 year ago

    First half is exactly how much I need to understand right now, thank you :)

  • @amreshgiri
    @amreshgiri 4 years ago

    Probably the best (easiest to understand in one go) video on BERT. Thanks ❤️

  • @jx7433
    @jx7433 8 months ago

    Excellent Explanation! Thank you!

  • @aitrends8901
    @aitrends8901 2 years ago

    Very nice high-level understanding of Transformers...

  • @abhikc8108
    @abhikc8108 4 years ago +3

    Great explanation, I really like the three-pass idea; it breaks down a lot of complexity into simple concepts.

  • @harshavardhany2970
    @harshavardhany2970 3 years ago +1

    Simple and clear explanations (which shows you know what you're talking about). And cool graphics. Will be back for more videos :)

  • @muhannadobeidat
    @muhannadobeidat 1 year ago

    Excellent introduction, visualizations and step by step approach to explain this. Thanks a ton.

    • @CodeEmporium
      @CodeEmporium  1 year ago

      You are oh-so welcome. Thank you for watching and commenting :)

  • @20.nguyenvongockhue80
    @20.nguyenvongockhue80 8 months ago

    Wow, AMAZING EXPLANATION. Thank you very much!

  • @arturjaroszewicz8424
    @arturjaroszewicz8424 8 months ago

    Beautiful explanation!

  • @seeunkim4185
    @seeunkim4185 2 years ago

    Thank you so much for the clear explanation, I've got a grip on BERT now!

  • @NachoProblems
    @NachoProblems 1 year ago

    Do you realize yours is the only good description of how exactly fine-tuning works that I have found? And I've been researching for months. Thank you!!!

    • @CodeEmporium
      @CodeEmporium  1 year ago +1

      You are too kind. Thank you for the donation. You didn't have to, but it is appreciated. Also super glad this content was useful! More of this to come.

  • @aashnavaid6918
    @aashnavaid6918 2 years ago

    Thank you so very much! One video was enough to get the basics clear.

  • @susantachary7456
    @susantachary7456 3 years ago

    No more reading after this. Loved IT. 😊

  • @atamir8339
    @atamir8339 5 months ago

    loved your explanation bro, earned yourself a sub

  • @akashsaha3921
    @akashsaha3921 4 years ago

    Well explained. Short and to the point

  • @LoLelfy
    @LoLelfy 3 years ago

    Omg your videos are so good! So happy I found your channel, I'm binge watching everything :D

    • @CodeEmporium
      @CodeEmporium  3 years ago +1

      Glad you found my channel too! Thank you! Hope you enjoy them!

  • @sabirahmed6191
    @sabirahmed6191 3 years ago

    Firstly, thanks for the really cool explanation. I would like to ask, though, that you please remove the text animation, as it causes a huge distraction for some people; I had to watch this with multiple breaks because my head was aching from it.

  • @MarketingLeap
    @MarketingLeap 4 years ago

    Loved how you explained BERT really well. Great job!

  • @thepresistence5935
    @thepresistence5935 2 years ago

    I studied this for 3 hours from a book, but you covered it in 10 minutes 😭. Super explanation, Thanksssssssssss

  • @mohsennayebi86
    @mohsennayebi86 2 years ago

    Amazing Explanation! I am speechless :)

  • @ngavu8750
    @ngavu8750 2 years ago

    So simple and easy to understand, thanks a lot

  • @pathuri86
    @pathuri86 2 years ago

    Excellent and concise explanation. Loved it. Thanks for this fantastic video.

  • @Kumar08
    @Kumar08 4 years ago

    Fantastic explanation; covered each and every point of BERT.
    Looking forward to more videos on NLP.

  • @sudharshantr8757
    @sudharshantr8757 2 years ago

    At 7:55, the position embeddings, as said in the video, encode the position of a word in a sentence. But in the slide, the sequence of position embeddings is E0..E5, E6, E7..E10 instead of E0..E5, E0, E1..E5 (implying that the position embedding of a word depends on how the 2 sentences are arranged).
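
    The observation above is right for the original BERT: position embeddings index the whole packed pair (E0..E10), and it is the segment (sentence A/B) embeddings that mark where one sentence ends and the other begins. A small sketch to check this, assuming the Hugging Face tokenizer and default position ids:

        from transformers import BertTokenizer

        tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
        enc = tokenizer("The cat sat.", "It was tired.")
        # Segment ids: 0 over [CLS] + sentence A + [SEP], then 1 over sentence B + [SEP]
        print(enc["token_type_ids"])  # e.g. [0, 0, 0, 0, 0, 0, 1, 1, 1, 1, 1]
        # Position ids are not part of the output; the model defaults to
        # 0..seq_len-1, i.e. positions keep counting across the [SEP] boundary.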

  • @themaninjork
    @themaninjork 3 years ago

    Your videos are very well explained!

  • @maryamaziz3841
    @maryamaziz3841 3 years ago

    Thanks for the wonderful explanation of the BERT architecture 🍀🌹

  • @blasttrash
    @blasttrash 9 months ago

    7:10
    If my question was "What is the color of the apple?" and the expected answer is something like "The color of the apple is red",
    then according to you at 7:10, BERT will only output the "start and end" words of the answer, which in this case would be "The" and "red". Just from these two words, how will I get the answer back to my question "What is the color of the apple?"
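
    For what it's worth, the model outputs start and end *positions*, and the predicted answer is the whole token span between them, not just the two boundary words. A minimal sketch with a SQuAD-fine-tuned BERT (the checkpoint name here is an assumption):

        import torch
        from transformers import AutoTokenizer, AutoModelForQuestionAnswering

        name = "deepset/bert-base-cased-squad2"  # assumed SQuAD-tuned checkpoint
        tokenizer = AutoTokenizer.from_pretrained(name)
        model = AutoModelForQuestionAnswering.from_pretrained(name)

        question = "What is the color of the apple?"
        context = "The color of the apple is red."
        inputs = tokenizer(question, context, return_tensors="pt")
        with torch.no_grad():
            out = model(**inputs)

        start = out.start_logits.argmax()             # first token of the answer
        end = out.end_logits.argmax()                 # last token of the answer
        span = inputs["input_ids"][0][start:end + 1]  # includes everything between
        print(tokenizer.decode(span))                 # e.g. "red"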

  • @keyrellousadib6124
    @keyrellousadib6124 2 years ago

    This is an excellent summary. Very clear and super well organized. Thanks very much

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Thank you so much for watching ! And for the wonderful comment :$

  • @josephpareti9156
    @josephpareti9156 2 years ago

    awesome introduction to a very challenging topic

    • @CodeEmporium
      @CodeEmporium  2 years ago

      Thank you. Uploading a related video on this soon too :)

  • @urd4651
    @urd4651 2 years ago

    Thank you for sharing the video! Very clear and helpful!!

  • @sathyakumarn7619
    @sathyakumarn7619 3 years ago

    Not bad! Loved the video. Please add a little more explanation in upcoming vids if preferable.

  • @antoinefdu
    @antoinefdu 3 years ago +1

    @CodeEmporium I have a question for you. Imagine the sentence:
    "My dad went to the dentist yesterday and he told him that he needs to floss more."
    Can BERT understand that in this context "he" is probably the dentist, and "him" my dad?

    • @CodeEmporium
      @CodeEmporium  3 years ago +2

      Great question. It's actually a studied problem in NLP called "cataphoric resolution" and "anaphoric resolution". I have seen examples of it online, so I think we should be able to do this to some degree.

    • @lumos309
      @lumos309 3 years ago +2

      It's actually even more complex than you mentioned, since the first "he" is the dentist but the second "he" is your dad! It would be fascinating to see if this can actually be done.

  • @ruyanshou805
    @ruyanshou805 3 years ago

    Well explained! I have been looking for something like this for quite a while!

  • @AbdulWahab-cy9ib
    @AbdulWahab-cy9ib 4 years ago

    In the video, you mentioned that during training C is a binary output, but in the paper it is mentioned that it is a vector.

  • @arvindvasudevan45
    @arvindvasudevan45 4 years ago

    Excellent articulation of the concept. Thank you.

  • @sajaal-dabet148
    @sajaal-dabet148 3 years ago +1

    This is amazing! Crystal clear explanation, thanks a lot.

  • @felixakwerh5189
    @felixakwerh5189 4 years ago

    Great video, I like the 3-pass method you used to explain the concepts.

  • @MarvinEckhardt
    @MarvinEckhardt 3 years ago

    I think there is a small problem: you said it understands language. But if I say something like "turn off the screen" or "it's dark here", will it ever be able to really understand this and actually turn off my monitor or turn on the lights? It only has the sequence of words; it doesn't know how to act on them in real-world situations. It would need to be combined with other neural networks and specialized hardware to give answers and perform actions in the real world, and to understand the situation using things like cameras and other sensors to generate the action that is requested.

  • @nooshnabi9248
    @nooshnabi9248 3 years ago

    Great Video! So easy to follow!

  • @johngrabner
    @johngrabner 2 years ago

    Excellent explanations. One question: For transformer-based translation, you feed one word at a time to get the next word. I assume you mean you concatenate the last output to the input to get the next word in the sentence. Is this the same for BERT-based translation? Without this, I would assume that the nth word of the output would be the superposition of all possible outputs.
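
    On the concatenation question: yes, in standard encoder-decoder transformer translation each predicted token is appended to the decoder input and the decoder is run again, reading only the newest position each step (BERT itself has no decoder, so it is not used for translation this way). A rough sketch under those assumptions, with a hypothetical encoder-decoder `model`:

        import torch

        def greedy_decode(model, src_ids, bos_id, eos_id, max_len=50):
            memory = model.encode(src_ids)  # hypothetical: encode the source once
            out = [bos_id]                  # decoder input starts as just <bos>
            for _ in range(max_len):
                tgt = torch.tensor([out])
                logits = model.decode(tgt, memory)       # hypothetical decoder call
                next_id = logits[0, -1].argmax().item()  # read ONLY the last position
                out.append(next_id)         # concatenate the prediction to the input
                if next_id == eos_id:
                    break
            return out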

  • @abhinavsukumarrao4039
    @abhinavsukumarrao4039 3 years ago

    Correct me if I'm wrong, but I don't think you can say that BERT is a language model, since you're taking the probabilities of the masked words alone, right? Also, don't the WordPiece embeddings make it output tokens that wouldn't make sense anymore for a language model?

  • @rashmikuchhal5339
    @rashmikuchhal5339 4 years ago

    I always watch your videos and appreciate the effort you put in to make complicated topics so easy and clear. Thank you for all your work. I really like the way you explain in 3 passes... great work!

  • @kogocher
    @kogocher 3 years ago

    Explained the concepts clearly.

  • @Deddiward
    @Deddiward 3 years ago

    Wow I've just discovered your channel, it's full of resources, very nice!

  • @caoshixing7954
    @caoshixing7954 4 years ago

    Very good explanation, easy to understand! Come on!

  • @쉔송이
    @쉔송이 4 years ago

    Hey, your explanation and presentation of complicated concepts made TF and BERT clear to me.
    I'm looking forward to more exciting videos.

  • @geekyprogrammer4831
    @geekyprogrammer4831 1 year ago

    very underrated channel!

  • @user-gx3ib9hi2o
    @user-gx3ib9hi2o 9 months ago

    Thanks a lot for the super clear explanation! Are your slides available by any chance (for reuse with attribution)?

  • @dupontremi5638
    @dupontremi5638 1 year ago

    Great explanation, I understood everything! Thanks a lot.

    • @CodeEmporium
      @CodeEmporium  1 year ago

      Thanks so much for watching and commenting :)