Training LLMs with Synthetic Data (How NVIDIA Trained Nemotron)

  • Published 15 Nov 2024

COMMENTS • 8

  • @WhatsAI
    @WhatsAI  1 month ago

    Get your copy of "Building LLMs for Production": amzn.to/4bqYU9b

  • @chaithanyavamshi2898
    @chaithanyavamshi2898 4 months ago +2

    Very helpful and a great explanation. Could you please make a code demonstration for generating synthetic data with NVIDIA Nemotron?

    • @WhatsAI
      @WhatsAI  4 months ago

      Thank you! I could look into doing that, yes :)

  • @DanFrederiksen
    @DanFrederiksen 4 months ago +2

    So did it work? You didn't seem to cover that.
    The premise seems flawed to me, given how poor LLMs are at anything progressive.

    • @thomasgoodwin2648
      @thomasgoodwin2648 4 months ago

      Until it's 'Star Trek' style, it's still all WIP. 😉

    • @DanFrederiksen
      @DanFrederiksen 4 months ago

      @@thomasgoodwin2648 ?

    • @WhatsAI
      @WhatsAI  4 months ago

      Oh wow, I completely forgot to share any tables or results. Sorry about that. Since the model is already publicly available, I assumed (as with GPT-4) that most people would have tried it or heard of it. An oversight on my end, but you can see them here: build.nvidia.com/nvidia/nemotron-4-340b-instruct (a minimal sketch of calling it through that API catalog page follows these comments).

  • @thomasgoodwin2648
    @thomasgoodwin2648 4 months ago +1

    It'll be interesting to see how far the models can go in extrapolating beyond the current state of human knowledge. Will they be geniuses or delusional crackpots? (Likely a bit of both, given the source of the original data set... humans.)
    🖖😎👍
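
Below is a minimal sketch (not shown in the video) of how generating synthetic training data with Nemotron-4-340B-Instruct could look, using the OpenAI-compatible endpoint listed on build.nvidia.com/nvidia/nemotron-4-340b-instruct. The base URL and model name follow that listing; the topic list, prompts, and the NVIDIA_API_KEY placeholder are assumptions for illustration only.

```python
# Sketch: generate a few synthetic Q&A pairs with Nemotron-4-340B-Instruct
# via NVIDIA's OpenAI-compatible API catalog endpoint.
from openai import OpenAI

client = OpenAI(
    base_url="https://integrate.api.nvidia.com/v1",  # NVIDIA API catalog endpoint
    api_key="NVIDIA_API_KEY",  # placeholder: replace with your own key
)

# Hypothetical seed topics; in practice these would come from your target domain.
topics = ["gradient descent", "tokenization", "LoRA fine-tuning"]
synthetic_pairs = []

for topic in topics:
    response = client.chat.completions.create(
        model="nvidia/nemotron-4-340b-instruct",
        messages=[
            {"role": "system",
             "content": "You write one concise question-and-answer pair suitable for LLM training data."},
            {"role": "user",
             "content": f"Write a question and answer about {topic}."},
        ],
        temperature=0.7,
        max_tokens=300,
    )
    synthetic_pairs.append({"topic": topic,
                            "text": response.choices[0].message.content})

# Print a short preview of each generated pair.
for pair in synthetic_pairs:
    print(pair["topic"], "->", pair["text"][:80], "...")
```

In a fuller pipeline you would also filter the generated pairs (for example with a reward model) before using them for fine-tuning, but the loop above shows the basic generation step.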