Smarter Reasoning w/o RAG: SOLUTION for Short-Context LLMs

  • Published Jan 2, 2025

COMMENTS • 128

  • @code4AI
    @code4AI  3 години тому

    With the automatic audio dubbing from YouTube/Google you hear a synthetic voice in your regional language.
    To hear my original voice in English, switch to "Default" or "English" in the settings. Thank you.

  • @debn_bey-jj9lq
    @debn_bey-jj9lq 3 дні тому +26

    I think you have the happiest greeting on the internet, you're doing excellent work, and you've got a knack for keeping me motivated. Looks like you've got a very demanding audience, but I think your approach and execution are "beautiful" and exactly what we need. Happy New Year, dude, from an old hand (who's dabbled in AI over the past 30 years).

  • @jswew12
    @jswew12 3 дні тому +26

    I personally love your videos! Never had a problem with your pronunciations either, seems like a weird critique. I think these are great introductions to the topics you discuss: They are normally the right level of depth to where I can get a general understanding of what the benefits/implementation of an approach are and then I can go read the source papers if I find the approach interesting or applicable to my problems. Thanks for making these!

    • @code4AI
      @code4AI  2 дні тому +2

      Perfect feedback. Thank you.

    • @actorjohanmatsfredkarlsson2293
      @actorjohanmatsfredkarlsson2293 День тому

      I agree. Personally I can have some difficulty following, especially when the math and acronyms are flying, but it's nothing I can't figure out by rewinding or checking with an AI chat. I like the level of complexity.

  • @ambujkumar7215
    @ambujkumar7215 3 дні тому +17

    Don't worry about these comments, man. You are a rockstar! You are doing all the hard work of finding relevant literature and helping us understand key concepts. I can just sit in my living room, have a cup of tea, relax, and catch up on the latest AI literature through your videos. That too, for free. I think it's amazing.

    • @code4AI
      @code4AI  2 дні тому +2

      Great comment. Thank you.

  • @user-pt1kj5uw3b
    @user-pt1kj5uw3b 3 дні тому +13

    Thanks for all the great videos. I think a lot of people didn't watch that whole video. This space is full of people who think they know more than they do, and they seem to be very loud. Also your pronunciation is fine and so is the length, and the powerpoint presentation style is far better... Please never stop making long videos, don't change anything! You are the only competent person exploring and making videos about papers on the frontier, and I have already gotten several great ideas from your videos or found papers I didn't know about.

    • @code4AI
      @code4AI  2 дні тому +2

      Highly appreciate your comment. Thank you.

  • @kevon217
    @kevon217 3 дні тому +10

    I find your style and tone quite delightful. Haters always gonna hate.

    • @code4AI
      @code4AI  2 дні тому +2

      Thank you. Appreciate it.

  • @gabrielkeith3189
    @gabrielkeith3189 3 дні тому +13

    Love this channel, don't change a thing.

    • @code4AI
      @code4AI  2 дні тому +3

      Great comment. Will do. Thank you.

  • @CharlesOkwuagwu
    @CharlesOkwuagwu 3 дні тому +15

    it has been a great year of content from you sir. thanks for the videos

    • @code4AI
      @code4AI  2 дні тому +3

      So nice of you

  • @diga4696
    @diga4696 2 дні тому +7

    YouTubers like yourself inspire me to go and look over 10 to 15 abstracts a day! Your inspiration is growing the collective intelligence; criticism will always exist.
    Thank you for all your hard work! Happy New Year! 🎉

    • @code4AI
      @code4AI  2 дні тому +1

      Happy New Year and Thank You!

  • @potential900
    @potential900 3 дні тому +22

    What the heck, your current video format is fine and accessible. Don't change it. Definitely don't add stock video for people with 3 second attention spans.

    • @code4AI
      @code4AI  2 дні тому +3

      Smile. I do like your comment ...

  • @ChaseFreedomMusician
    @ChaseFreedomMusician 3 дні тому +5

    I really appreciated your latest video. Your ability to break down these dense concepts is always impressive, and I think the way you tied Cache-Augmented RAG to previous ideas like Infinity Attention is particularly insightful. Watching this got me thinking about how the two mechanisms could complement each other in practical applications, so I wanted to share some thoughts.
    Infinity Attention’s compressive memory system, which retains high-dimensional representations of past key-value states without relying on softmax, could integrate seamlessly with a cache-augmented setup. By allowing efficient retrieval and reuse of compressed contextual embeddings, it enables a more adaptive and nuanced interaction between cached and external retrieval sources. Specifically, combining precomputed, dense vector embeddings from the cache with newly retrieved context could enhance both the robustness and the contextual relevance of the outputs.
    This pairing could be particularly valuable for tasks involving smaller datasets or domain-specific examples. By leveraging Infinity Attention’s ability to retain and blend contextual information across long input sequences, we could build systems that not only perform well with limited data but also dynamically adapt to varied retrieval scenarios. Of course, there are constraints, such as balancing the representational capacity of compressed memory and ensuring information fidelity, but techniques like rotational embeddings or other dimensionality-preserving transformations might mitigate these issues.
    Your video sparked this train of thought, and I think there’s a lot of potential in exploring how these mechanisms can work together beyond their individual strengths. I’d love to hear your perspective on this, especially since you’ve done such a great job highlighting their foundational ideas. Thanks again for putting this out; it’s always a pleasure to learn from your work.
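    A minimal sketch of the caching half of this idea, assuming a Hugging Face causal LM (gpt2 as a small stand-in, not anything shown in the video): the knowledge text is run through the model once to build a KV cache, and every query then reuses that cache instead of re-reading the knowledge.

    ```python
    # Cache-augmented sketch: precompute the KV cache for a fixed knowledge block
    # once, then reuse it for every query (greedy decoding kept deliberately simple).
    import copy
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    model = AutoModelForCausalLM.from_pretrained("gpt2")
    model.eval()

    knowledge = "Knowledge: The support line is open 9:00-17:00 CET on weekdays.\n"
    knowledge_ids = tokenizer(knowledge, return_tensors="pt").input_ids

    with torch.no_grad():
        # One forward pass over the knowledge builds the reusable cache.
        knowledge_cache = model(knowledge_ids, use_cache=True).past_key_values

    def answer(question: str, max_new_tokens: int = 30) -> str:
        past = copy.deepcopy(knowledge_cache)   # keep the precomputed cache pristine
        ids = tokenizer(question, return_tensors="pt").input_ids
        generated = []
        with torch.no_grad():
            for _ in range(max_new_tokens):
                out = model(ids, past_key_values=past, use_cache=True)
                past = out.past_key_values
                next_id = out.logits[:, -1].argmax(dim=-1, keepdim=True)
                generated.append(next_id)
                ids = next_id                   # only the newest token is fed next step
        return tokenizer.decode(torch.cat(generated, dim=-1)[0])

    print(answer("Question: When is the support line open? Answer:"))
    ```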

  • @snow8725
    @snow8725 3 дні тому +7

    Thank you for explaining the detail a bit more than most people will, it's nice to understand how things work when you care about more than just the result. People can always just copy paste the abstract and the conclusion of a paper without reading it, into chatGPT, and say "summarize this" if they don't wanna know.

  • @KitcloudkickerJr
    @KitcloudkickerJr 3 дні тому +7

    Don't change anything. I love the channel as it is.

    • @code4AI
      @code4AI  2 дні тому +1

      Thank you so much.

  • @anubiseyeproductions2921
    @anubiseyeproductions2921 3 дні тому +6

    Haters are going to hate. You are loved & appreciated. Anyone in the know understands that you know your stuff.

  • @BeMoreDifferent
    @BeMoreDifferent 3 дні тому +5

    Congratulations on an extremely successful year with an unbelievable amount of some of the best content in the AI environment in the world. Enjoy your start to the new year; I'm looking forward to learning much more from you next year.

  • @carson1391
    @carson1391 3 дні тому +4

    This was funny. I like your videos. Always a must watch. I feel like we both have a lot in common as far as perspective

  • @tantzer6113
    @tantzer6113 2 дні тому +4

    I am an outsider to data science, not even a novice. Your channel has become the only one on YouTube that I watch daily and religiously. I like the fact that your content is more advanced than where I am. It leads me to look up things if I want to understand something better. For example, I spent an hour in conversation with Perplexity AI to understand the algorithms and methods behind CAG. If you take the time to create lengthy tutorials for novices, you might not be left with enough time and energy to present the cutting edge of research as you have been doing. Besides, YouTube is already full of tutorials and introductory lectures, but few people do what you have been doing with as much consistency and clarity. I prefer you stay true to the topics, and to your style, rather than dumb things down beyond recognition for the general public. By the way, I plan to watch your old videos, too, since they seem to cover many important areas and essential background information. Please don’t insert videos, unnecessary jokes, or anything else that might distract from your topic; your presentation style is perfect already. Your accent is great, too. People will say many unhelpful things in the comments. That’s part of the territory. Think of the fact that many are appreciative. And, finally, thank you for teaching us, and teaching so well.

    • @code4AI
      @code4AI  2 дні тому +3

      I do highly appreciate your comment. Thank you.

  • @consistent1
    @consistent1 3 дні тому +3

    Happy New Year, Community! :) Thank You for the great videos and for your dedication. Your simple examples allow me to absorb many abstractions while still, as you described in this video, "making sure we are on the same page here." You might say they provide a KV store in a simple schema I can use with my tiny context window.
    BTW, the second method (of augmenting the store) creates a persistent data structure (if you squint a little).
    One might use a functional programming language (say, Clojure) to implement the first method using PDS and get two for the price of one, both memory optimization and caching. You may get compositionality for free. While at it, you can store your data in an HD vector. Hash all the things in a content-addressable manner, and voilà. Now, you can populate a vector subspace that is self-describing to the LLM. At least, that's the theory. You still have to ask the LLM to interpret the user's input according to the prepopulated subspace, but I bet if you ask nicely, it will indulge you.
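    To make the "hash all the things in a content-addressable manner" part concrete, here is a minimal Python sketch (class and names are illustrative, not from the video): every snippet is stored under the hash of its own content, so duplicates collapse and references stay stable.

    ```python
    # Content-addressable memory sketch: snippets keyed by the hash of their content.
    import hashlib

    class ContentStore:
        def __init__(self):
            self._blobs = {}                      # digest -> snippet text

        def put(self, text: str) -> str:
            digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
            self._blobs.setdefault(digest, text)  # idempotent: same content, same key
            return digest

        def as_context(self, digests) -> str:
            # Assemble a prompt-ready context block from selected snippets.
            return "\n".join(self._blobs[d] for d in digests)

    store = ContentStore()
    k1 = store.put("CAG preloads knowledge into the KV cache.")
    k2 = store.put("CAG preloads knowledge into the KV cache.")  # deduplicates: k1 == k2
    print(store.as_context([k1]))
    ```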

  • @cjbarroso
    @cjbarroso 3 дні тому +5

    You are the best, I watch all your videos. Happy New Year!

    • @code4AI
      @code4AI  2 дні тому +1

      Happy New Year!

    • @cjbarroso
      @cjbarroso День тому

      @code4AI the best for you

  • @iwswordpress
    @iwswordpress 3 дні тому +5

    Thanks for reading and explaining complex new ideas!

    • @code4AI
      @code4AI  2 дні тому

      Glad it was helpful!

  • @josepht.2491
    @josepht.2491 3 дні тому +5

    Love all the content.

  • @OccamsPlasmaGun
    @OccamsPlasmaGun 3 дні тому +3

    A YouTube video can have a lot of different kinds of viewers, and those who find it valuable will come back. Seems like a simple solution for sorting the audience among various creators.
    I share your excitement about new tech and elegant solutions to issues that arise in this field. Keep your enthusiasm!

  • @gunterstrubinsky9452
    @gunterstrubinsky9452 3 дні тому +4

    Being an Austrian myself and having been faculty at a US university, with an accent similarly 'bad' to Dr. Ruth Westheimer's, I found that most people care about the content and not the pronunciation.

  • @MegaClockworkDoc
    @MegaClockworkDoc 3 дні тому +5

    I like your English. Don't worry about the trolls, they are likely bots.

  • @dmytroaleinykov4088
    @dmytroaleinykov4088 2 дні тому +1

    Hi, thanks for diving into this interesting topic, and your pronunciation is actually very good in my opinion!
    It is an interesting topic for me, but still not so easy to understand the idea.
    Would you mind making one more explanation for dummies with maybe a visual representation? It could be easier for visual learners.

  • @covertassassin1885
    @covertassassin1885 3 дні тому +1

    Great video!
    I think the concept of this externalized memory for the "small-context-length LLM" or "agent" is very similar to an agent "scratchpad" for working memory. However, now it is externalized, so it can be retrieved from, added to, and otherwise modified.
    The difference from the title is that RAG could still be performed on that external memory store if it grows too large to fit into the context window.
    However, you still get the benefit of the KV cache if most of the relevant memories retrieved stay consistent across multiple LLM calls, resulting in KV cache hits.
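    A minimal sketch of the fallback described above, with illustrative names (the `embed` callable stands in for any sentence-embedding model): the whole scratchpad goes into the prompt while it fits a token budget, and only once it outgrows the budget does the code fall back to RAG-style top-k retrieval.

    ```python
    # External scratchpad: dump everything while it fits the context budget,
    # fall back to top-k similarity retrieval once it grows too large.
    from typing import Callable, List
    import numpy as np

    class Scratchpad:
        def __init__(self, embed: Callable[[str], np.ndarray], budget_tokens: int = 2000):
            self.embed = embed
            self.budget = budget_tokens
            self.notes: List[str] = []
            self.vecs: List[np.ndarray] = []

        def add(self, note: str) -> None:
            self.notes.append(note)
            self.vecs.append(self.embed(note))

        def _tokens(self, text: str) -> int:
            return len(text.split())             # crude token estimate for the sketch

        def context_for(self, query: str, k: int = 5) -> str:
            whole = "\n".join(self.notes)
            if self._tokens(whole) <= self.budget:
                return whole                     # small memory: send it all, KV cache stays stable
            q = self.embed(query)                # memory too large: RAG-style fallback
            sims = [float(q @ v / (np.linalg.norm(q) * np.linalg.norm(v) + 1e-9)) for v in self.vecs]
            top = sorted(range(len(sims)), key=sims.__getitem__, reverse=True)[:k]
            return "\n".join(self.notes[i] for i in sorted(top))
    ```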

  • @ladonteprince
    @ladonteprince День тому

    You promote new knowledge!!!! I'm dead! Wow... that one broke my mind. I'm from America and was an English teacher in China for businessmen and children. Your English is perfect. Your personality is excellent. Thank you for the joy in your voice when you present these solutions that can sometimes appear, or actually be, boring to understand.

  • @smicha15
    @smicha15 День тому +2

    Yeah, I enjoy your videos, but your critics have a point... I'm an industrial designer who pivoted into AI adoption strategy for a financial firm. I really want this company to have the best AI, and your channel is really critical of all technologies. Unless you really invest in providing examples of new techniques using financial data, or market research, or something technical like manufacturing, your audience will have a lot of trouble trying to understand how this new methodology actually works FOR THEM. You have clearly mastered all this technology, but it's also clear how much you gloss over. What I would recommend is for you to load all the comments from recent videos, and your scripts, into an LLM, and ask what it thinks is missing that would give your critics some satisfaction... Also, process diagrams are really useful.

  • @btscheung
    @btscheung 2 дні тому

    You're my best AI knowledge resource on the internet! I love your in-depth guidance on the topics, reviews of the most advanced papers, and analytical guidance on how to approach the new ideas. I always refer to you with my friends and colleagues as the "Austrian Professor"! I am truly educated by your videos. Please continue providing these no-gimmick, no-commercial-agenda knowledge transfers! Love your channel.

  • @Spreadshotstudios
    @Spreadshotstudios 2 дні тому +1

    I think a better explanation of why to ditch RAG is this: if you just need to load knowledge into an agent, CAG is a winner. But if you need to embed the MEANING of something and compare it to other meanings (i.e., the dot product of vectors), then you still need the features that RAG operates on (embeddings, vectors, and comparisons). Unless I'm missing something, why stop using vector tech entirely?
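    For readers unfamiliar with the "dot product of vectors" part, a tiny numpy sketch of the comparison RAG is built on; the vectors are made up and stand in for real embedding-model outputs.

    ```python
    # Cosine similarity: the "compare meanings" primitive that RAG relies on.
    import numpy as np

    def cosine(a: np.ndarray, b: np.ndarray) -> float:
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    vec_a = np.array([0.2, 0.7, 0.1])  # e.g. embedding of "refund policy"
    vec_b = np.array([0.1, 0.8, 0.2])  # e.g. embedding of "how do I get my money back"
    vec_c = np.array([0.9, 0.0, 0.1])  # e.g. embedding of "GPU temperature limits"

    print(cosine(vec_a, vec_b))        # high: related meanings
    print(cosine(vec_a, vec_c))        # low: unrelated meanings
    ```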

  • @UserB_tm
    @UserB_tm 2 дні тому

    You are so gracious with the snide comments. I am a beginner, so I appreciate and need explanations of what seasoned developers instinctively know. Love all your videos, and especially your optimism and enthusiasm.

  • @theatheistpaladin
    @theatheistpaladin 2 дні тому +1

    Keep up the good work. Haters are just loud, not the majority.

  • @animeplustrance
    @animeplustrance 2 дні тому

    I listened to one video on SimGRAG on this channel and instantly subscribed: friendly, enthusiastic, explanatory.
    I listen when I work out, on the go, all the time, like a podcast, and constantly immerse myself in the way you present knowledge.

  • @Bill-Sci
    @Bill-Sci 2 дні тому

    Your strong passion and curiosity for this topic is very contagious. It is absolutely wonderful! Don't change a thing. If the way you are doing things now is maximising your passion, people can't help but feel inspired to learn.

    • @code4AI
      @code4AI  2 дні тому

      Thank you! Will do!

  • @alexjensen990
    @alexjensen990 2 дні тому +2

    Well, this video made me feel both angry and sad. I know that wasn't your intent, but the way that people behave online is simply the worst that humans have to offer, more frequently than should be tolerated. I, for one at least, have learned so much from your videos, and they have inspired me not only to research further, but to switch professional fields entirely from analytical chemistry to data engineering and ML/DL. For what it's worth, your simple explanations and examples lowered the point of entry for me and, at my advanced age, encouraged me to pull myself out of a career that was long past soul-sucking... and with not a moment to lose. 20 years of analytical chemistry, a few years to learn programming and data analysis from more or less ground zero, and hopefully finishing up with 20 years of invigorating and rewarding work in dev and analysis. So, from the middle of America, I want to say thank you. Your efforts have changed my life... And I don't know how good my English is, but I know for sure that my German is awful!

    • @code4AI
      @code4AI  2 дні тому

      What a powerful comment. Thank you so much.

  • @alpaykasal2902
    @alpaykasal2902 2 дні тому

    You can't please everyone. Keep up the great work. And thanks for being humble. You are appreciated.

  • @acestapp1884
    @acestapp1884 День тому

    Keep kicking ass, don't let the haters get you down. I appreciate your explanations and breakdowns. Thanks for doing analysis and connecting ideas instead of just reading the paper.

  • @RK-AU
    @RK-AU 2 дні тому

    Thanks for all the videos. I’ve been watching the channel most days for over a year and I really appreciate the consistent effort. Nobody is obligated to keep watching, but thousands do keep watching so something is right. As an amateur, the content is pitched at the right level or above my standard and THAT IS BEAUTIFUL! It’s up to me to take the time to develop my knowledge. That’s why I’m here. Keep up the good work.

    • @code4AI
      @code4AI  2 дні тому +1

      I appreciate that!

  • @mohamedkarim-p7j
    @mohamedkarim-p7j День тому +1

    Thanks for sharing 👍

  • @Alorand
    @Alorand 2 дні тому +1

    Your last video was my introduction to the topic. I am interested in this, but I am still behind on most of the terminology.

  • @pierrefournier8100
    @pierrefournier8100 День тому

    Keep cool and carry on! Do not let people tell you what to do and how to do it; please keep sharing with pleasure, as you always have. Happy New Year!

  • @joser100
    @joser100 День тому

    Wow! Just listening to your references to the haters' comments... You have an unconditional follower here. Sure, you have a slight German accent (OK, Austrian), but your English is absolutely perfect other than the accent, and to be honest, I like the accent and everything... Not sure this is for beginners though; OK, you do reference greenhorns (or did in the past), but for the most part I can guarantee that your background in maths is well beyond beginner level. I love it, even if it is way past my level at times. You have all my attention, and I am sure, the attention of thousands more, no matter the few haters...

  • @andrewowens5653
    @andrewowens5653 День тому +1

    Thank you for putting the boneheads in their place. Now let's get on with the inspiration... !

  • @wwkk4964
    @wwkk4964 3 дні тому +1

    I love your English! 🎉

  • @davidwynter6856
    @davidwynter6856 2 дні тому

    You provide an invaluable service to me. I used to trawl through lots of papers myself, but your interests align closely with mine. So you save me a lot of time. The only complaint I have is that you waste your precious time responding to the complainers. Let them go elsewhere if they don't like something! So keep it up, accent and all, there are plenty who love it just the way it is.

    • @code4AI
      @code4AI  2 дні тому

      Thank you so much.

  • @BradleyKieser
    @BradleyKieser 2 дні тому

    Thank you for everything that you did this year. Absolutely love your videos.

  • @rajeshkr12
    @rajeshkr12 2 дні тому

    It doesn't matter; your content rocks. I love to watch your content, and the objective of catering to everyone from novice to expert is very tricky. Whatever you share, it's awesome. Keep doing it, and Happy New Year 2025!

  • @kevinpham6658
    @kevinpham6658 2 дні тому

    Hey, I love your videos. They are genuinely useful, and I frequently send them to people to help them understand papers. You have a gift for education, for sure; not sure you have to modify your approach at all. There are haters everywhere. Ignore them.

    • @code4AI
      @code4AI  2 дні тому

      Appreciate your comment.

  • @MetaMeta-ic1wr
    @MetaMeta-ic1wr 2 дні тому

    Happy New Year!!! 🎉🎉 Thank you for this amazing video. I now have a much better understanding of what you were talking about in the last video. Thanks so much. Please continue to make videos like this! (Wtf. Who are you? How do you understand the papers and manage to break them down so nicely? Thank you for the new ideas. I guess you must be really busy in your life, so thank you for making time to do this, only for our benefit.)

  • @johncarloslorieta1443
    @johncarloslorieta1443 2 дні тому

    Man, don't change the format; the video has a formal structure that really explains everything.

  • @BradleyKieser
    @BradleyKieser 2 дні тому

    Those commentators are wrong. Love the format and style of your videos. Don't change.

  • @tnypxl
    @tnypxl 2 дні тому

    Dude, I love your videos. If I want to learn about implementing RAG myself, it's not your videos I'm watching for that anyway. I love hearing about research from the field because there is a TON OF IT coming out all the time.

    • @tnypxl
      @tnypxl 2 дні тому

      Don't change a thing!

    • @code4AI
      @code4AI  2 дні тому

      I appreciate that!

  • @nicosilva4750
    @nicosilva4750 3 дні тому

    I am an American and can understand EVERYTHING you say. You speak perfectly fine.

  • @acasualviewer5861
    @acasualviewer5861 2 дні тому

    About memories:
    I don't understand the last step. Once you have this long memory, how do you get it to compute an answer using the entire memory?
    Won't the memory exceed the context size?
    I'm confused about that.

  • @billybofh2363
    @billybofh2363 3 дні тому

    Have a great new year! Thanks for all the work you've put into your videos - much appreciated! Though you could sprinkle more cute cat pictures in them.... ;-p

  • @patruff
    @patruff 3 дні тому +3

    I tried CAG; it wasn't that great. I don't like how it's model-specific. RAG seems more modular in that you can use the embeddings for any model.

  • @wct88
    @wct88 5 годин тому

    Your style is great, simple and to the point.
    Please don't use stock video styles like others do. These stock videos are useless and boring, and everyone on YouTube uses the same ones over and over.

  • @chrismangun
    @chrismangun 2 дні тому

    It feels like you are leaning into future-state architecture lately. I'd be curious what your opinion is on Sparse Attention Mechanisms as the logical next step for rough, token-less, and colloquial user needs. The market is pushing the need in interesting directions in this emergent space, with new constraints. A multi-layered hybrid attention framework combining Sparse, Dynamic, Hierarchical, and Cross Attention is best suited for solving emerging cross-use-case needs: architectures that offer scalability, adaptability, and robust multimodal reasoning, critical for achieving the cross-user goals of generalization and versatility across domains. Continuous learning mechanisms could enhance this framework with adaptability to new data and tasks. // PS: You've been an incredible inspiration and teacher. Never spend any more time on troll comments. Our time is too precious to provide social therapy for those who haven't taken the time to socialize themselves. I appreciate you and the candlelight of reason and curiosity that is carried on from share to share.

  • @mtprovasti
    @mtprovasti 2 дні тому

    The first thing that came to mind from the last video was the context length; luckily I made it to the 19-minute mark. My topmost thought about this video: would it be helpful to load the initial memory state from a vector space or KG anyway, and after the process maybe offload the result back to it? Similar to how long-term and working memory work.

  • @andydataguy
    @andydataguy 2 дні тому

    I'm glad that you design your videos your way. Great to see your wholesome approach to responding to criticism! 🙌🏾💜

  • @Silberschweifer
    @Silberschweifer День тому

    What, 128k context windows are small?
    Most small local models offer a 4k context window...

  • @maertscisum
    @maertscisum 2 дні тому

    What? I love your way of explaining things and it does spark interest. Don't change your approach for those selfish individuals. They can find other channels that meet their expectation or hell, they can start their own channel. Leave us alone.

  • @ricosrealm
    @ricosrealm 3 дні тому

    Don't change anything, continue to do what you feel is the best.

    • @code4AI
      @code4AI  2 дні тому

      Will do. Thank you.

  • @Swooshii-u4e
    @Swooshii-u4e 3 дні тому

    Have you received any sponsorship offers before, and if so, how many?

  • @haroldpierre1726
    @haroldpierre1726 2 дні тому

    You are such a humble man. I would be upset by the negative comments from the Darens and Karens who are so ungrateful for your videos. Most of those comments were not constructive. I always look forward to learning something new from your videos.

    • @code4AI
      @code4AI  2 дні тому

      I appreciate that!

  • @carlosforero8455
    @carlosforero8455 День тому

    Thank you, your content is so valuable. Can you please share some repository which shows the application of this new RAG method?

  • @oyajiru
    @oyajiru 2 дні тому

    Some people are just salty you shattered their dream of going from RAG to riches 😊
    Your videos are amazing. You have replaced all the other channels I was watching because you provide a fun way to digest the cutting edge of the field.
    Don't worry about the people grumbling, it's normal for channels that reach your size. It will slowly get drowned out by a growing audience that you do connect with.

  • @ArianeQube
    @ArianeQube 3 дні тому +2

    I love how every video says goodbye RAG, but RAG still remains the best option for the vast majority of applications.

  • @madarauchiha2584
    @madarauchiha2584 2 дні тому

    I like your videos so much. Don't let some comments dishearten you

    • @code4AI
      @code4AI  2 дні тому

      Thank you so much 😀

  • @MrTachy0n
    @MrTachy0n 2 дні тому

    Lol I watch your videos partly for accent and to get outside of American mindset and into Germanic lol .. thanks brotha.

  • @JimMassey-g7z
    @JimMassey-g7z День тому

    Maybe structure your video using the Pyramid Principle so the smug 'experts' can hop off early. I appreciate your explanations ... and if I don't I skip the parts I don't want to view. Keep up the great work! I like your videos and test what you discuss if I find it worth reviewing.
    Some videos could be shorter, but skipping is easy enough, so I don't understand the haters.

  • @houwenhe4748
    @houwenhe4748 2 дні тому

    I love your contents. ❤

    • @code4AI
      @code4AI  2 дні тому

      So great to see your comment.

  • @acasualviewer5861
    @acasualviewer5861 2 дні тому

    I don't understand these critiques.. what makes your videos good are the detailed explanations.

    • @code4AI
      @code4AI  2 дні тому +1

      Appreciate your comment. Thank you.

  • @borisguarisma8810
    @borisguarisma8810 День тому

    Please don't change! Keep up the good work and stay true to yourself. Ignore comments from lazy people who want to standardize the way people communicate on YouTube.

  • @augmentos
    @augmentos 2 дні тому

    Happy New Year!! 👀👀👀 Haha, you have GREAT English, my guy! In fact you have a kind charm and are far from horrible! Haha, these comments are insane; maybe I just never read many others. You are legit a G in the space; these haters don't even know. Do NOT make 1-2 min videos (or just don't stop making the long in-depth videos; that is what I like). I literally follow probably more AI channels than 99.9999% of people and speed-watch a TON. You are a cut above, ser. Happy New Year!!

    • @code4AI
      @code4AI  2 дні тому

      Happy New Year to you.

  • @reynoldoramas3138
    @reynoldoramas3138 2 дні тому

    I can't understand the poor reasoning behind those comments; it's shameful. Keep doing your videos, they are awesome.

  • @scottmiller2591
    @scottmiller2591 11 годин тому

    A good paper review requires the participants to at least go back and read the literature to clarify their questions, before complaining about the review. Some people want to be spoon-fed - that's not a paper review. It's fair to ask for an example, though.
    The framework PRISM presented here is dangerously close (some would say identical) to the GOFAI (Good Old-Fashioned AI) approach called "warehouse of rules," where the base rules live in the schema and update methods, the derived rules in the JSON/etc. store, and using an LLM to start the base at a high level of abstraction. It doesn't quite rise to the level of what is usually thought of as neural-symbolic processing, in the sense that neural weights aren't updated, but is an interesting compromise. It also looks like it could be used to produce a kind of interpretable result, in the sense that the rules in the schema, update methods, and stores are all open to examination and human-interpretable, although still suffering from the complexity of "warehouse of rules" if a large corpus is used to prepare the store.
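    A purely illustrative sketch of the "warehouse of rules" layout described above (not PRISM's actual schema or update methods): base rules and their update method live in code, while derived rules accumulate in a plain JSON store that stays open to human inspection.

    ```python
    # "Warehouse of rules" sketch: base rules in the schema, derived rules in a JSON store.
    import json
    from dataclasses import dataclass, field
    from typing import Dict, List

    @dataclass
    class RuleStore:
        base_rules: List[str] = field(default_factory=lambda: [
            "Prefer cached knowledge over retrieval when it fits the context window.",
        ])
        derived: Dict[str, str] = field(default_factory=dict)

        def update(self, rule_id: str, rule_text: str) -> None:
            # Update method: derived rules are added or revised, never hidden in weights.
            self.derived[rule_id] = rule_text

        def dump(self, path: str) -> None:
            with open(path, "w") as f:
                json.dump({"base": self.base_rules, "derived": self.derived}, f, indent=2)

    store = RuleStore()
    store.update("r001", "If the memory store exceeds the budget, fall back to retrieval.")
    store.dump("rules.json")                     # every rule stays human-interpretable on disk
    ```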

  • @HoldMyData
    @HoldMyData 3 дні тому

    Man, you really hate RAG. xD Happy New Year :D

    • @code4AI
      @code4AI  2 дні тому +1

      Happy New Year - with much less RAG !

  • @Swooshii-u4e
    @Swooshii-u4e 3 дні тому +1

    Lol, and I am over here wishing you could be simpler.

  • @simplegeektips1490
    @simplegeektips1490 2 дні тому

    And I left a comment on CAG... still waiting for your reply.

  • @TaddeusBuica
    @TaddeusBuica 2 дні тому

    Do your thing bro, loving your videos, don't mind the haters

  • @vicaya
    @vicaya 3 дні тому

    The people who were criticizing were probably selling conventional vector DBs 😀. Keep it up. But do keep a clear separation between what the literature says and the hallucinations of your LLMs and yourself.

  • @robertladwig6075
    @robertladwig6075 2 дні тому

    Do your videos the way you want; do not pay attention to the negative comments. If they want something different, YouTube is an open platform; they can do what they want the way they want to.

    • @code4AI
      @code4AI  2 дні тому

      Thank you for this comment.

  • @45Thyoutube
    @45Thyoutube 23 години тому

    This story is so funny and so like my life
    🤣🤣
    I need to go travel with family

  • @rascalwind
    @rascalwind 2 дні тому

    Accent is awesome.

  • @simplegeektips1490
    @simplegeektips1490 2 дні тому

    Your English is super good! Just try asking an American or a Brit to speak another language... 😂

  • @45Thyoutube
    @45Thyoutube 23 години тому

    Like you said to me,
    it's my life 🤣🤣

  • @acasualviewer5861
    @acasualviewer5861 2 дні тому

    I don't get the criticism of your pronunciation of "RAG"... to my American English ears it sounds just fine.
    Better to focus on the topics, not the irrelevant details. I like your videos.

    • @code4AI
      @code4AI  2 дні тому +1

      Thank you for this comment.

  • @АлександрВернигоров-э7н

    Don't sweat it, man.

  • @deauvelli
    @deauvelli 9 годин тому

    Don't listen to the consumers of the "this is THE AI game changer" 2-minute videos with stock footage and "that thumbnail". But you could add more programming and use-case content.

  • @densonsmith2
    @densonsmith2 2 дні тому

    Going for small, open source models is a path to nowhere. The cost of the proprietary models is tiny compared to the value and they are rapidly becoming cheaper and more powerful. How cheap does a wrong answer need to be in order to use a cheaper model instead of the correct answer from a better model?
