Confused which Transformer Architecture to use? BERT, GPT-3, T5, ChatGPT? Encoder-Decoder Explained

  • Published 3 Jun 2024
  • This video explains all the major Transformer architectures and differentiates between the most important Transformer models.
    Which Transformer architecture to use for a particular problem in Natural Language Understanding (NLU) and Natural Language Generation (NLG) is explained in a simplified manner (see the short code sketch after the chapter list below).
    Over the past 6 years, Transformers, a neural network architecture, have completely transformed state-of-the-art natural language processing and the way we approach problems in NLU and NLG.
    Chapters:
    0:00 Introduction
    1:21 Encoder Branch
    1:57 BERT
    2:37 DistilBERT
    3:19 RoBERTa
    3:59 XLM
    4:50 XLM-RoBERTa
    5:32 ALBERT
    6:40 ELECTRA
    7:19 DeBERTa
    8:13 Decoder Branch
    8:50 GPT
    9:13 CTRL
    9:54 GPT-2
    10:31 GPT-3
    11:30 GPT-Neo/GPT-J-6B
    11:50 Encoder-Decoder Branch
    12:00 T5
    13:05 BART
    13:46 M2M-100
    14:22 BigBird
    #datascience #neuralnetwork #machinelearning #naturallanguageprocessing
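
    A rough sketch of the three branches using the Hugging Face transformers library (the pipeline tasks and checkpoint names below are common public defaults, assumed here for illustration, not necessarily the ones shown in the video):

        from transformers import pipeline

        # Encoder branch (BERT-style): suited to NLU tasks such as classification.
        classifier = pipeline("sentiment-analysis",
                              model="distilbert-base-uncased-finetuned-sst-2-english")
        print(classifier("Transformers make NLP so much easier."))

        # Decoder branch (GPT-style): suited to NLG, i.e. free-form text generation.
        generator = pipeline("text-generation", model="gpt2")
        print(generator("Transformers have changed NLP because", max_new_tokens=20))

        # Encoder-decoder branch (T5/BART-style): sequence-to-sequence tasks
        # such as summarization and translation.
        summarizer = pipeline("summarization", model="t5-small")
        print(summarizer("Over the past six years, Transformers have completely "
                         "transformed state-of-the-art natural language processing, "
                         "changing how we approach problems in NLU and NLG.",
                         max_length=30, min_length=5))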

COMMENTS • 64

  • @datafuseanalytics
    @datafuseanalytics  1 year ago +7

    In this video, I tried to explain all the major Transformer architectures. I have also explained the differences and the training objective of each of them. If you feel this video adds value, then please like, share and comment on this video and subscribe to this channel. If you have any suggestions or feedback, please drop them in the comment box.

  • @aurkom
    @aurkom 9 months ago +5

    It would have been awesome if all the models had their release year mentioned as well. Helps to get a bird's-eye view of the timeline.

    • @datafuseanalytics
      @datafuseanalytics  9 months ago

      Hello. Yes, I am making a separate video on a similar topic. It will be uploaded soon. Stay tuned, my friend.

  • @snehotoshbanerjee1938
    @snehotoshbanerjee1938 4 days ago

    Great summary!!

  • @kevon217
    @kevon217 1 year ago +1

    thanks for the excellent, well-explained summary!

  • @santoshpanigrahi5711
    @santoshpanigrahi5711 1 year ago +2

    Thanks for sharing. It's very informative. Keep up with this work.

  • @hemantwani4757
    @hemantwani4757 25 days ago

    Very nicely explained ❤👍

  • @ajitkumar15
    @ajitkumar15 1 year ago +1

    Very nice and to the point video, thank you !!!

  • @milindkubal2738
    @milindkubal2738 1 year ago +5

    Amazing. Great work👍

  • @SagarBhalke-td3vy
    @SagarBhalke-td3vy 1 year ago +1

    Great explanation. Thank you very much

  • @falknfurter
    @falknfurter 7 months ago +1

    I just found this video and it's very good. I'm currently trying to understand when to use what type of model. Looking at Hugging Face is just overwhelming. That's where this video jumps in and provides an excellent overview of the major models. I wish there were a similar video explaining the various pretraining objectives.

    • @datafuseanalytics
      @datafuseanalytics  7 months ago

      Hello. I will definitely make a video on the same. Thanks a lot. 😀

  • @sagar3482
    @sagar3482 1 year ago +1

    Informative content
    Thanks for sharing this

  • @SaketKumar-wy1wb
    @SaketKumar-wy1wb 1 year ago +1

    This is good. Keep up the good work. 🙂

  • @ganeshkharad
    @ganeshkharad 9 months ago +1

    this is a really nice explanation!!!

  • @sanjaybhalke8032
    @sanjaybhalke8032 1 year ago +2

    Thanks for sharing

  • @exxzxxe
    @exxzxxe 1 year ago +1

    Well done!

  • @user-os1xi8qf4y
    @user-os1xi8qf4y 6 months ago +1

    Thank you sir! Fantastic method of explanation

  • @adityakshirsagar1391
    @adityakshirsagar1391 1 year ago +1

    Informative 👍

    • @datafuseanalytics
      @datafuseanalytics  1 year ago

      Glad it was helpful and informative for you, Aditya. Please do share it with your friends. More interesting videos will be uploaded soon.

  • @ahmedelsabagh6990
    @ahmedelsabagh6990 2 months ago +1

    Great video!

    • @datafuseanalytics
      @datafuseanalytics  2 months ago

      Thanks a lot. Please do share it with your friends 😁

  • @sarc007
    @sarc007 1 year ago +1

    Excellent

  • @rembautimes8808
    @rembautimes8808 3 months ago +1

    Excellent video and I joined as a sub. I like this style of going through the various architectures and their use cases. Maybe you can also update it with GPT-4 since it's newly out there.

    • @datafuseanalytics
      @datafuseanalytics  2 months ago

      Thanks a lot for this amazing comment. I have uploaded the latest video using the ChatGPT model - ua-cam.com/video/MKHEaxdoqxA/v-deo.html
      Please go through it and feel free to comment

  • @WillBeebe
    @WillBeebe 1 year ago +1

    Superb 🎉

  • @d4munche3z
    @d4munche3z 1 year ago +1

    Can you create a tutorial on Longformer and the concepts/code used to adapt an LLM for larger token sizes?

    • @datafuseanalytics
      @datafuseanalytics  1 year ago +1

      Hello David. I haven't made it yet, but I will definitely make one on Longformer etc., which takes a whopping 4096 tokens as input. Thanks for your feedback.
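
      For reference, a minimal sketch of loading Longformer with the Hugging Face transformers library, assuming the public allenai/longformer-base-4096 checkpoint:

          # Longformer accepts sequences of up to 4096 tokens,
          # versus the 512-token limit of standard BERT-style models.
          from transformers import LongformerTokenizer, LongformerModel

          tokenizer = LongformerTokenizer.from_pretrained("allenai/longformer-base-4096")
          model = LongformerModel.from_pretrained("allenai/longformer-base-4096")

          # A long input that a 512-token model would have to truncate.
          text = "Transformers have transformed natural language processing. " * 300
          inputs = tokenizer(text, truncation=True, max_length=4096, return_tensors="pt")
          outputs = model(**inputs)

          print(inputs["input_ids"].shape)        # up to (1, 4096) token IDs
          print(outputs.last_hidden_state.shape)  # one contextual embedding per token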

  • @amortalbeing
    @amortalbeing 7 months ago +1

    thanks a lot❤

    • @datafuseanalytics
      @datafuseanalytics  7 months ago

      You are most welcome 😃 Do check out the other AI videos on this channel too.

  • @projectbit2248
    @projectbit2248 1 year ago

    Hello, how do I contact/connect with you regarding a project?

    • @datafuseanalytics
      @datafuseanalytics  1 year ago

      Hello, please contact us via our email. datafuseanalytics@gmail.com

  • @mhaya1
    @mhaya1 8 months ago +1

    Kudos🎉

  • @markfallu2389
    @markfallu2389 11 months ago +1

    Great summary - it would be good if you did an update.

    • @datafuseanalytics
      @datafuseanalytics  11 months ago

      Sure. I will make an updated video comprising all the possible model architectures.

  • @tilkesh
    @tilkesh 1 year ago +1

    Thx

  • @ianboyles2425
    @ianboyles2425 1 year ago +1

    There are some important new ones like the newer GPT-Neo models, Alpaca, LLaMA, cereus, Vicuna.

    • @datafuseanalytics
      @datafuseanalytics  1 year ago

      Hello Ian. Yes, at the time of this session, these models weren't available. Thank you for your feedback. I will definitely make a video (part 2) which will cover these models in a simpler fashion.

  • @chenpeter7428
    @chenpeter7428 1 year ago +1

    It seems it does not cover BERT in computer vision.

  • @ko-Daegu
    @ko-Daegu 1 year ago +3

    This sounds like it was copy-pasted from online articles and just read from them, without any extra info at all.

    • @datafuseanalytics
      @datafuseanalytics  1 year ago

      Hey Ko-Jap. I referred to multiple books and then wrote the content in my own words. I did not refer to any online blogs or articles; only books are the reference. But thank you for your valuable feedback. I will improve so that it doesn't sound as if I am reading. 🙏😀

  • @yosup125
    @yosup125 5 months ago +1

    for the algo

  • @gregbugaj
    @gregbugaj 9 months ago

    Nice overview

  • @saketkr
    @saketkr 1 year ago +2

    This is good. Keep up the good work. 🙂