The Evolution of AI: Traditional AI vs. Generative AI

  • Published 4 Jul 2024
  • Build AI applications with watsonx → ibm.biz/BdmC5a
    AI-powered tools have been used for decades, but recent breakthroughs in generative AI have pushed the topic front and center. So what is really different about new generative AI models, such as large language models (LLMs), compared with traditional AI? In this video, Shad Griffin explains the fundamental differences in their architectures that have made generative AI possible.
    AI news moves fast. Sign up for a monthly newsletter for AI updates from IBM → ibm.biz/BdmC5G

COMMENTS • 17

  • @amritbro
    @amritbro 2 days ago +3

    Generative AI is trained on loads and loads of data, which is the prime reason it is so much more powerful than traditional AI.

  • @rcstscc
    @rcstscc 3 days ago +3

    Can we evaluate generative AI as "AI as a service" in this framework?

  • @RobertvMill
    @RobertvMill 2 days ago +3

    AMAZING presentation. thank you

  • @christopherpetersen342
    @christopherpetersen342 2 days ago +3

    Sort of an odd description that didn't mention symbolic logic, rule-based systems, optimization problems, etc. historically. The feedback loop is about machine learning in the context of self-improving systems and doesn't necessarily imply artificial intelligence at all. And it's not about holding lots of data lately; it's about (usually unsupervised) learning over massive data sets that aren't exactly stored in the model or the application layers you drew. No mention of transformers or foundational models?

  • @drS1lv3r
    @drS1lv3r 1 day ago

    Great video! Thank you!

  • @simondian6380
    @simondian6380 18 hours ago

    As always well explained.

  • @yogeshbharadwaj6200
    @yogeshbharadwaj6200 2 days ago

    Very nice explanation... made it easy to understand. Thanks a lot.

  • @maikvanrossum
    @maikvanrossum 1 day ago +1

    So basically ML models vs. LLMs…? Because ‘traditional’ makes it sound like ‘classic’ or ‘vintage’ vs. ‘modern(ized)’ and ‘only new’. There are scenarios and use cases where the LLM runs inside or on-prem, or the dataset is scoped and therefore reduced. So yes, I can see the differences between the two architectures shown here, but I think this comparison is confusing.

  • @markfitz8315
    @markfitz8315 5 hours ago

    Thanks. 👍

  • @paolo.miscia
    @paolo.miscia 2 days ago

    I found this very useful. It might be a bit... simplistic, but it really helps differentiate the two.

  • @Jeff-Lynn
    @Jeff-Lynn 2 days ago +1

  • @Jagentic
    @Jagentic 1 day ago

    A naïve question: given the advances in quantum computing, and the specific q’s they’ve been able to demo with astounding results, how far are we from an AI training run formulated in such a way as to use quantum’s enormous efficiency and capacity, even if only for a specifically structured single-ish shot? Still a longshot?

  • @IcyAmphibian
    @IcyAmphibian 1 day ago

    JASMY, built on IBM’s Hyperledger Fabric, combines generative AI, blockchain, IoT edge-network, and decentralized GPU services for AI computation. Imagine all of that Earth data in a secured and distributed IPFS, with NFT technology tying the data locker IPFS address to the user in an anonymous and private manner.

  • @Studyforsuccesstogether
    @Studyforsuccesstogether 2 days ago

    How do you write like that? Or is it some trick to make the video like that?

  • @michaelpaczynski941
    @michaelpaczynski941 2 days ago

    Thank you, but I’m not convinced. The fundamental architecture from the early days of the Turing machine still applies. The difference is where the repository is and the sophistication of the mathematics.

  • @thyagarajesh184
    @thyagarajesh184 2 days ago

    #RETE #Forgy

  • @nil0bject
    @nil0bject 1 day ago

    "AI is all the rage" lol wtf