Is Meta’s new AI really open source?

  • Published Sep 24, 2024
  • Facebook/Meta have been big open source proponents for a while, but the investment into open source AI is unrivaled. Super excited by what they're cooking, even if "open source" might not be the best term. Something something Llama 3.1
    Check out my Twitch, Twitter, Discord more at t3.gg
    S/O Ph4se0n3 for the awesome edit 🙏

COMMENTS • 170

  • @t3dotgg
    @t3dotgg  1 month ago +88

    My explanation of the "model size" was not great. I liked this more concise description from @essamal-mansouri2689
    "let's say you have a line equation (f(x) = mx + b). The x is the input, the m and x are the biases and weights. Only two weights in my example. 405 billion in their case."

    • @game_time1633
      @game_time1633 1 month ago +17

      also, Meta's code for training the model is open. Only the data is closed

    • @ech0o0o0o0z
      @ech0o0o0o0z 1 month ago +16

      m are the weights and b are the biases

    • @jwickerszh
      @jwickerszh 1 month ago +5

      Even that's not great; that example has one weight and one bias. But a simple way to explain it is that it's the number of parameters that define the operations the model has to make.
      That's also why you can only run the smaller models locally (with usable performance): every parameter has to fit in VRAM to perform those operations on the GPU as quickly as possible. It's also why quantization is a thing, why LoRA is great, etc ...

    • @incription
      @incription 1 month ago +5

      terrible example

    • @RPG_Guy-fx8ns
      @RPG_Guy-fx8ns 1 month ago +3

      If your layer is 128 neurons wide, there would be 128 weights multiplied by 128 activations from previous layer, summed together, then you add 1 bias, and a Relu function, to calculate the activation of 1 neuron, then do all that for the rest of the neurons in the current layer before moving on to the next layer. The weights belong to the neuron you are calculating, so the next neuron will have a different set of 128 weights to multiply with the same 128 activations from the previous layer. Each neuron would have 128 weights and 1 bias.
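
A minimal Python sketch of the single-layer computation @RPG_Guy-fx8ns describes (128 weights per neuron, one bias, ReLU). The widths and the constant weight/activation values are illustrative only, not taken from any real model:

```python
def dense_layer(activations, weights, biases):
    """One fully connected layer: for each neuron, multiply its weights by
    the previous layer's activations, sum them, add the bias, apply ReLU."""
    out = []
    for neuron_weights, bias in zip(weights, biases):
        z = sum(w * a for w, a in zip(neuron_weights, activations)) + bias
        out.append(max(0.0, z))  # ReLU
    return out

WIDTH = 128
prev_activations = [0.5] * WIDTH                  # activations from the previous layer
weights = [[0.01] * WIDTH for _ in range(WIDTH)]  # 128 neurons x 128 weights each
biases = [0.1] * WIDTH                            # 1 bias per neuron

layer_out = dense_layer(prev_activations, weights, biases)
params_in_layer = WIDTH * WIDTH + WIDTH           # weights + biases for this one layer
```

Each neuron here reuses the same 128 input activations but has its own row of 128 weights, so even this single toy layer holds 16,512 parameters.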

  • @nuttygold5952
    @nuttygold5952 1 month ago +21

    I think this video highlights that influencers talk about subjects like they understand them, but don't.

  • @essamal-mansouri2689
    @essamal-mansouri2689 1 month ago +130

    405b is the number of weights, not the amount of data it was trained against. For example, let's say you have a line equation (f(x) = mx + b). The x is the input, the m and b are the weights and biases. Only two weights in my example. 405 billion in their case.

    • @cinoss5
      @cinoss5 1 month ago +5

      small typo: m and b are…

    • @worldadmin9811
      @worldadmin9811 1 month ago +2

      you mean m and b ?

    • @theairaccumulator7144
      @theairaccumulator7144 1 month ago +7

      Once again more clueless yapping from Theo's side. The only reason I'm subscribed to him nowadays is to drop a dislike every time he spews garbage and presents it as fact, so hopefully fewer people get deceived.

    • @warrenarnoldmusic
      @warrenarnoldmusic 1 month ago

      We have networks that don't count biases

    • @essamal-mansouri2689
      @essamal-mansouri2689 1 month ago

      @@worldadmin9811 yeah sorry fixed it
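
The distinction this thread is making — parameter count is a property of the model's shape, independent of how much data it was trained on — can be sketched in a few lines of Python. The layer widths below are hypothetical:

```python
def count_params(layer_widths):
    # A dense network: each layer contributes (inputs x outputs) weights
    # plus one bias per output neuron.
    return sum(i * o + o for i, o in zip(layer_widths, layer_widths[1:]))

# f(x) = m*x + b is a 1-in, 1-out "network": 1 weight + 1 bias = 2 parameters,
# whether you fit it on ten data points or ten billion.
line_params = count_params([1, 1])
print(line_params)  # 2
```

A 405B model is the same idea scaled up: roughly 405 billion learned numbers, regardless of the (much larger) token count it was trained on.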

  • @zanfur
    @zanfur 1 month ago +107

    That's very much not what the "billion numbers" mean. It's how many parameters are in the neutral net. It was trained on waaaaaaay more data than that.

    • @zanfur
      @zanfur 1 month ago

      You're also wrong about the open source bit. It's a public GitHub repo, facebookresearch/xlformers. It's very much open source.

    • @themprsndev
      @themprsndev 1 month ago +15

      Yup. I was shocked at how wrong he got it lol. Llama 3.1 was trained on 15 T tokens btw.

    • @primafitra5610
      @primafitra5610 1 month ago +2

      *neural

    • @ramanpreetsingh7463
      @ramanpreetsingh7463 1 month ago +2

      came here to say this

  • @dungeon4971
    @dungeon4971 1 month ago +19

    The source code used to train the model is also open; only the dataset used is not available to the public. It's also more accurate to say the models are open weight than anything else, because Meta's licensing terms only allow commercial use up to a certain threshold.

  • @MommysGoodPuppy
    @MommysGoodPuppy 1 month ago +12

    Llama source code is available actually, just a few hundred lines of PyTorch; the data it was trained on is the part that's closed

  • @canofpulp
    @canofpulp 1 month ago +43

    The model size is what the B means, not the amount of data used for training

    • @SAsquirtle
      @SAsquirtle 1 month ago +14

      thanks for the comment. there's no point watching this video if he doesn't even know that

    • @infinitivez
      @infinitivez 1 month ago +7

      It's neither the model size nor the "amount of data used for training."
      It's literally the number of parameters of the model, in billions.
      Why you'd discount an entire video over something so low-level, when Theo literally said he's paraphrasing how this stuff works, is beyond me. Especially when neither of you knows what that number means.

    • @SAsquirtle
      @SAsquirtle 1 month ago +2

      @@infinitivez pretty sure it's obvious that model size implies number of parameters lmao

    • @infinitivez
      @infinitivez 1 month ago

      @@SAsquirtle whatever you say. because that doesn't at all get confusing when they get conflated with datasets or download sizes.

    • @SAsquirtle
      @SAsquirtle 1 month ago

      @@infinitivez um no it doesn't? I mean sure if you're new to llms then kinda I guess

  • @ugotisa
    @ugotisa 1 month ago +28

    It is not totally open source, but you can't deny it's better than *open* AI

    • @ChristianSasso
      @ChristianSasso 1 month ago

      If by better you mean 'gratis', then I have to agree. But if we dig a little deeper, the only difference is that you have to pay a monthly fee to use OpenAI's models. That's it: both are closed source, in the sense that the scripts and procedures used to generate the weights are secret in both cases, no different from the source code of an application whose binary is provided either for free (Meta) or for profit (OpenAI).

  • @RajarshiKhatua100
    @RajarshiKhatua100 1 month ago +2

    theo's slowly entering the world of practical comedy here

  • @vsolyomi
    @vsolyomi 1 month ago +5

    "Genuine" and Zuck in one sentence is a huge red flag, imo. If Theo ever uses "sincere" and Zuck in one sentence I'll start reconsidering my life choices...

  • @infinitivez
    @infinitivez 1 month ago +21

    Totally agree: what Meta is doing is not open source. We also still have no idea what datasets they pulled from.
    These companies really do enjoy trying to twist the term into anything that suits them. Had me excited for a bit, until I found out, oh, it's just the model data.
    Really not doing any of us any favors. They take all this university work, steal it under the guise that they will be open with the results, lie, and then CLAIM it's open source.
    Which is weird, because aren't they talking about companies being able to train their own models? Looks like the only way you can create a model at this time is by using Hugging Face's web portal, which sort of defeats the entire purpose of being off-grid.
    It's open model, not open source.

    • @Sanchuniathon384
      @Sanchuniathon384 1 month ago +1

      Yeah, it would be nice to open-source an entire model pipeline, e.g. training data, transformations, model training with environment and parameters, and the output model.

    • @RPG_Guy-fx8ns
      @RPG_Guy-fx8ns 1 month ago

      If Hugging Face is malicious, everyone is hacked.

    • @lessmoneylessproblems5145
      @lessmoneylessproblems5145 1 month ago

      Speak for yourself, the smaller LLama models are very useful for customer service use cases.

    • @infinitivez
      @infinitivez 1 month ago

      @@lessmoneylessproblems5145 Not saying AI, or Meta's models, aren't useful. I'm saying their abuse of the term open source isn't doing anyone any favors.
      Releasing the actual source for the modeling would be even more useful, and something I'd love to see.
      But go off I guess?

    • @infinitivez
      @infinitivez 1 month ago

      @@RPG_Guy-fx8ns only takes a data breach. And most companies don't want to spend the time necessary, uploading and dealing with 3rd party licenses that are subject to change.

  • @VivekYadav-ds8oz
    @VivekYadav-ds8oz 1 month ago +8

    I'm not listening to someone's AI takes when they don't even understand the title of the model. (I am listening though, but only for hate watching).

    • @azmah1999
      @azmah1999 1 month ago +1

      Why are you hurting yourself?

    • @arvininer
      @arvininer 1 month ago

      😂

  • @nicosilva4750
    @nicosilva4750 1 month ago +3

    Even if you had the source to create the model, you still wouldn't have the data, which Meta's own description describes as crucial for the formation (cleaned data) of the model. However, the release does include the weights, and the model isn't "agentified" the way some other models are (Mistral). This allows you to do your own fine-tuning and agentification with a lower incidence of loss.

  • @television9233
    @television9233 1 month ago +3

    4:45 the number is NOT the number of tokens... it's the number of parameters. Those two are very separate things.

  • @happykill123
    @happykill123 1 month ago +11

    Can't wait to put my binaries on GitHub and call my project open source

  • @gbjbaanb
    @gbjbaanb 1 month ago +3

    Bryan "he who shall be " Lunduke had a great vid/article about this, the OSI themselves are pushing a not-very-open model for AI. Sure, the source code to build an AI is open source, but that's all. The important part (ie which creators have had their work stolen to build the dataset) is most definitely not included. There's a lot of money involved in this, so don't expect any of the open source foundations to care about open source any more.

  • @vorpled
    @vorpled 1 month ago +2

    Now it makes perfect sense why Nvidia has been holding its low and mid-tier graphics cards to 8GB of VRAM, even though it would cost them $20-50 maximum to double that.
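
The VRAM point can be sanity-checked with back-of-envelope arithmetic. This sketch counts weight storage only (activations and the KV cache need more on top), and the model sizes are the commonly cited ones:

```python
def weight_memory_gb(n_params, bits_per_param):
    # Bytes to hold just the model weights, converted to gigabytes.
    return n_params * bits_per_param / 8 / 1e9

# Llama 3.1 405B at fp16 (16 bits per weight): far beyond any consumer card.
fp16_405b = weight_memory_gb(405e9, 16)   # ~810 GB
# An 8B model quantized to 4 bits: fits comfortably in an 8 GB card.
q4_8b = weight_memory_gb(8e9, 4)          # ~4 GB
```

This is exactly why quantization matters for local inference: dropping from 16 bits to 4 bits per weight cuts the footprint by 4x.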

  • @0xmassive526
    @0xmassive526 1 month ago +7

    Funny how the thumbnail says misleading while the 405B explanation is straight up wrong HAHAHAHAHAHAHAHA

  • @putnam120
    @putnam120 1 month ago +16

    They won't release the code and data used to create the model cause you know that there is data in there that they legally shouldn't have used. Basically a big "cover your ass" move.

    • @themprsndev
      @themprsndev 1 month ago +4

      @@putnam120 the code is open source, only the data is closed.

    • @putnam120
      @putnam120 1 month ago

      You need the data to reproduce the model. So in a way the data is actually the real code.

    • @themprsndev
      @themprsndev 1 month ago +4

      @@putnam120 yes. Still, the code is open source. This is an important thing to note, because with the code you can train your own models on your own data.

    • @putnam120
      @putnam120 1 month ago +1

      @@themprsndev oh I agree that having it is also important. But the code to train is not the real challenge in this area

  • @noone-ld7pt
    @noone-ld7pt 1 month ago +23

    Yea you should probably learn a bit more about this before releasing a video on it. Some pretty glaring mistakes in this one.

  • @Tylerlaws00n
    @Tylerlaws00n 1 month ago +6

    Do you invest in corona though? 🤔

  • @ElevateConsultingDave
    @ElevateConsultingDave 1 month ago +1

    @t3dotgg, great post and analysis. I enjoyed your boiled-down description of an LLM. I also like to think that training on a data set is similar to compression technology. You feed a model a lot of data, and you're left with a weighted neural net that represents that data but is much smaller than the training data.

  • @grayaj23
    @grayaj23 1 month ago +3

    It does sit a bit odd thinking of Meta as the smart custodian of the future.
    I can't imagine Zuckerberg not wanting to use AI tools to drive engagement and further quantize and isolate human behavior.
    Make sure to count the silverware. I'll say this at least: if we can figure out what his angle is and sanitize it so that it doesn't expose people to more targeted misinformation, at least his motives are trustworthy. He's the profound cynic's dream operator -- his motives are transparent and therefore (at least can be) predictable.

  • @acegear
    @acegear 1 month ago +1

    I worry about how Meta uses it: their closed-source frontend or backend could insert stuff you don't want and pass it to Meta AI like nothing happened, like Gemini, where they inject additional keywords they want at the front and merge them with your search

  • @josechristianromero334
    @josechristianromero334 1 month ago

    Hey Theo! Thanks, I love how you broke down this post, that's super cool.

  • @professormikeoxlong
    @professormikeoxlong 1 month ago

    8:02 wait this is actually pretty damn interesting to read through

  • @michaelkershaw7231
    @michaelkershaw7231 1 month ago +2

    if the model was unquantized it probably would be closer to a terabyte

  • @mchisolm0
    @mchisolm0 1 month ago +17

    Yeah, maybe "free software", but definitely not "open source".

    • @AncientSocrates
      @AncientSocrates 1 month ago +6

      Free software is a subset of open source. Maybe it's just freeware.

    • @mchisolm0
      @mchisolm0 1 month ago

      @@AncientSocrates Yeah, I like that a lot. I like that it is freemium adjacent.

  • @dipereira0123
    @dipereira0123 1 month ago

    It's the same strategy Bill used when he figured out people were pirating Windows: he didn't care. It was even better for business that people joined the market with Windows knowledge, forcing companies to adopt it instead of Mac.

  • @maximousblk
    @maximousblk 1 month ago +3

    thats the wrong logo bru

  • @mkDaniel
    @mkDaniel 16 days ago

    The 40b models are usually trained on 3 trillion tokens

  • @zaneearldufour
    @zaneearldufour 1 month ago +1

    Unless they've published a solution to MechInterp, none of these models should be considered "open source".
    "Open weights"? Sure

  • @math-s
    @math-s 1 month ago

    Thanks Corona for supporting Theo 🚀

  • @ChristianSasso
    @ChristianSasso 1 month ago +1

    Just a clarification.
    When the binaries of an application are free, but the source code is not, we say that the application is gratis to use, but it is not free or open source.
    Similarly, when the weights of an AI model are free, but the scripts and procedures to produce them are not, we say that the AI model is gratis to use, but it is not free or open source.
    In short:
    binary : source code + build system == weights : scripts + procedures

  • @ech0o0o0o0z
    @ech0o0o0o0z 1 month ago +3

    A more apt term is open-weights models

    • @MommysGoodPuppy
      @MommysGoodPuppy 1 month ago +2

      Llama source code is open too; in fact it's just a few hundred lines of PyTorch. The data, though, is not open

    • @ech0o0o0o0z
      @ech0o0o0o0z 1 month ago

      @@MommysGoodPuppy true. A truly open source-and-weights model that I know of is Amazon's Chronos time-series model. They have thus far released GPT-2 and T5 based time-series transformer model weights on Hugging Face, PyTorch source code to pre-train and fine-tune models, synthesized datasets, and the source code of associated libraries they developed to generate large semi-empirical time-series datasets.

  • @just0focus
    @just0focus 1 month ago +1

    Agreed, it may be free, but not really open.

  • @akam9919
    @akam9919 1 month ago

    I'm disappointed that Ollama's response wasn't just "The." 7:00

  • @jacmkno5019
    @jacmkno5019 1 month ago

    Correction: the number of parameters has nothing to do with the training data set. It's the number of weights in the whole network, so it describes the architecture of the network, not the training dataset. That's the reason those numbers are so large.

  • @imfilou
    @imfilou 1 month ago

    Hello Theo, I comment rarely but I wanted to share. There's a project I just discovered called "Ever UI", which is another shadcn-like component library. I don't like the way they are positioning themselves, but he has interesting ideas.
    What I don't like is that he took components from other component libraries to put in his own. Even though he made several syntax improvements to make the base components more readable and more maintainable, I wish he had made his own.
    He has wrappers to make every component animated, custom hooks, and all. It seems promising in concept, but it's just one guy coming from nowhere, so I don't know how to feel about it. I'm just here to share. Love your content, have a nice day.

  • @Fanaro
    @Fanaro 1 month ago

    It's not that this is an iffy case for open source. It's not open source, period.

  • @danielratiu4318
    @danielratiu4318 1 month ago

    My English is not that good. This is what I understand. Trust in Meta is low to zero and sales would be difficult. So in order to cripple competition they release the models for free and calling that open source. Did I get it?

  • @nicejungle
    @nicejungle 1 month ago

    Models are data, like a picture or a database dump.
    Therefore "open-source models" are nonsense.
    That's why free data is covered by Creative Commons licenses and source code by GPL2 or MIT licenses

  • @boredstudent9468
    @boredstudent9468 1 month ago

    I don't think jailbreaks are a big problem; the creation of ABC weapons doesn't really fail on the general knowledge, but on application, resources, and adversaries (e.g. the CIA sending you a Hellfire)

  • @aLfRemArShMeLlOw
    @aLfRemArShMeLlOw 1 month ago

    Also, Zuckerberg stays at Meta because he's got a tight "CEO 4 life" clause and cannot be ousted, which was/is not the case for all other tech founders...

  • @Karurosagu
    @Karurosagu 1 month ago

    Finally, the AI bubble is deflating

  • @Caldaron
    @Caldaron 1 month ago

    how much for a product placement? ;-)

  • @ソンタック
    @ソンタック 1 month ago

    It's definitely semi-open source, since neither the training data nor the training code is provided

  • @Brymcon
    @Brymcon 1 month ago +1

    Look up how much the cost has come down in a year. Intelligence is about to be free and you'll be able to run it on your phone. Mistral large is a fraction of the size and is frontier.

    • @ElmerGLue
      @ElmerGLue 1 month ago

      It is nowhere near free. You might be able to offset some costs, such as running a GPU instead of a heater, but gas is still cheaper in many areas where available. Also, since everyone expects large gains from GPU workloads, everyone is adding their margins ahead of time.

  • @Fanaro
    @Fanaro 1 month ago

    31:25 That's also basically the Slippery Slope Fallacy. There are tons of things we have today (like legislation) that could be extrapolated into oblivion. (I don't like current AI btw.)

  • @abstractionGod
    @abstractionGod 1 month ago +1

    The beer makes everything better

  • @morgan0
    @morgan0 1 month ago

    i’d rather see advancements in ok models that run really fast on consumer hardware, like apple can probably do ok on any apple silicon device because the gpu is decent and shared memory makes it able to fit the model data in, but like, i think we could do better, at least for question answer reasoning. maybe have more than one system instead of super complex autocomplete. maybe something that can reason spatially, or something which can do math better, and take a lot of common tasks and make a more specialized system which is better with less computation. then combine all that with a smaller word token predictor which can fill in the gaps and connect the other systems. maybe there’s more complexity inside neurons but i think we can learn something from the way the brain is structured, with many specialized regions which handle certain tasks directly, connected together to be better than the sum of their parts.

  • @ktb1381
    @ktb1381 1 month ago

    Who cares how pretty the ecosystem is if the accuracy isn't great? Imho, accuracy is king on these things.

  • @ZeljkoRacki-wy7tc
    @ZeljkoRacki-wy7tc 1 month ago

    Great video 👍

  • @wesleycoder
    @wesleycoder 1 month ago

    Corona should sponsor Theo

    • @wesleycoder
      @wesleycoder 1 month ago

      ABInbev get in there, I want to see Theo say no to your face, 😂

  • @TheApeMachine
    @TheApeMachine 1 month ago

    Then there would be this one guy, which was somewhere that the A.I. missed, and now he's the only one who still remembers the Beatles. Funny thing though, he is also a failed musician, and then he starts to release Beatles songs as his own, and become like super famous. Then Ed Sheeran has a cameo and it'll all be pretty much downhill from there...

  • @quemediga
    @quemediga 1 month ago

    8:35 that's not silly, nor is it something new. It's a way of justifying some deterministic beliefs, such as "everything happens for a reason", which have been around at least since Aristotle.

  • @fahadrx8fy
    @fahadrx8fy 1 month ago

    I guess the best terminology for what Meta is doing with their models is 'Open AI' instead of open source but unfortunately that terminology is taken

  • @code_of_honour
    @code_of_honour 1 month ago

    llama 3 still fails at how many r's are in " Renderer "
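
This failure mode is usually attributed to tokenization: the model sees word pieces rather than individual characters, which makes letter-counting awkward for it even though the character-level check itself is trivial:

```python
# Case-insensitive count of 'r' in "Renderer": R, r, r -> 3.
word = "Renderer"
r_count = word.lower().count("r")
print(r_count)  # 3
```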

  • @oaklyfoundation
    @oaklyfoundation 1 month ago

    The 80s, where did you go

  • @Strammeiche
    @Strammeiche 1 month ago +1

    I'm still wondering why he spends so much money and then gives it away.

  • @survivor303
    @survivor303 1 month ago

    Damn, I hate these pinkshirts going on about ethics. If you have bad intentions, this tool is for you; if you have good intentions, this tool is for you. It is that simple. Freedom is freedom for everyone.

  • @the_yugandharr
    @the_yugandharr 1 month ago

    why is your github "code" button blue? some new theming shit?

  • @longnamedude3947
    @longnamedude3947 1 month ago

    No, their AI is not Open-Source.
    It is technically Shareware, or a derivative of Freeware.
    But it is certainly not Open-Source; no new definition is required.
    Anyone who can't use a search engine to go to the Free Software Foundation website and learn how these terms are defined should simply say no more on the subject: do your research, or move on to another subject where you are informed about what is being discussed.
    Additionally, the video creator Theo needs to do some more basic research on what all of these terms mean in relation to AI, as it is very clear that he does not fully understand the definitions of the subject matter at hand, which makes it even harder for those who are informed to contribute constructively to the discussion.
    Thanks for highlighting Mark's misuse of the term "Open-Source". For someone so smart, he knows he is misusing the term to try to gain an advantage, and we need to make sure that he and his platform do not have any pillars to stand on. They are squarely in the wrong with how they have obtained data for training these "AI Models", and they need to be held accountable for actions which I believe most people would consider illegal, since they didn't have explicit permission from the content owners. On multiple occasions they have violated all sorts of legally binding licenses used by millions of different platforms and people, each of whom is owed individually agreed-upon compensation, or removal of their data from any derivative software and removal of the original source itself, as would apply if this were, say, a physical painting that had been copied.
    Invalidating these legally binding documents would technically invalidate every legally binding document that exists, because it is a direct violation of the rights of the content owners, software owners, contributors, and hosting service providers/publishers.

  • @4lc0h0l
    @4lc0h0l 1 month ago

    Hmmm, two important errors. First, that's Ollama's logo, not Meta's Llama AI models'. Also, when explaining the models and their sizes, you're mixing up a lot of terms.

  • @yinyangphoenix
    @yinyangphoenix 1 month ago

    I’ll just bet that their AI is designed to keep making people into products. It’s all the guy wants to do.

  • @anthonylancer
    @anthonylancer 1 month ago

    I don't know why people are disliking this video. I only disagree with the fundamental implication that Zuck is a free agent and not CIA/Mossad. Aside from that, good video.

  • @anonymouscommentator
    @anonymouscommentator 1 month ago +2

    Sorry Theo, but I haven't seen such a bad video in a long time. You might want to take it down.
    As you understand by now, your model explanation was utter garbage and highly misleading (which you call "not great" in your pinned comment... no, it was completely wrong).
    Also, the Llama source code IS available on their GitHub. Your entire video revolves around the source code not being available and modifiable, when that is just plain wrong. Also, nowhere in the definition of open source does it state that they would have to include the training data. They offer the source code and even the weights; this is by definition open source. You can go and study the architecture, modify the source code, or just use it as is. For example, Mistral only releases the weights without the source code, which is hence called open-weight but not open-source.
    I get that you have bills to pay, but this video is just you reading an existing article and then spreading misinformation literally the entire time. It's such a shame, because literally 2 minutes of Google could have prevented this.

    • @joeyhicks1670
      @joeyhicks1670 1 month ago +1

      Ayyyye there’s the comment I was looking for

  • @professormikeoxlong
    @professormikeoxlong 1 month ago

    SpongeBob SA what? SpongeBob San Andreas?

  • @that_matt
    @that_matt 1 month ago

    So it's shareware...

  • @reinduhr
    @reinduhr 1 month ago

    If Meta's LM is not open source, then what is it?

  • @thisaintmyrealname1
    @thisaintmyrealname1 1 month ago

    didn't know Meta also gave us PyTorch! I honestly think the world would be a better place without Meta's products😅 but it would be a worse place without their great open-source libraries!

  • @PhilipKNguyen
    @PhilipKNguyen 1 month ago

    There's a lot of shit that's beyond bleeding edge, which I nickname "stabbing edge". If criminals get that in their hands, it is not a pretty thing!!

  • @noah12121
    @noah12121 1 month ago

    If the danger of AI is that there is some chemical reaction I can do in my garage that destroys the world, I think we're OK...

  • @pehclark7256
    @pehclark7256 1 month ago

    Llama 3.1?

  • @_aNeaire
    @_aNeaire 1 month ago

    Dude, I just played Universal Paperclips for almost an hour now, try it!

  • @vorpled
    @vorpled 1 month ago +1

    There is a piece of the picture missing from this.
    He explains that they've released this model to benefit everyone.
    He leaves out that they make the money to develop the software to help everyone by exploiting everyone.

  • @jeanpepin5869
    @jeanpepin5869 1 month ago

    Artificial intelligence is knowing joe who answers all the questions because when it doesn't know, it has ersatz to sell. When you are naturally intelligent you don't need artifices like economics and cocaine. ;)

  • @pempheromkuka7874
    @pempheromkuka7874 1 month ago

    Find a new word then, cause open source is the closest term to describe what Meta is doing

    • @overwrite_oversweet
      @overwrite_oversweet 1 month ago +1

      Freeware and shareware were never open source.

    • @pempheromkuka7874
      @pempheromkuka7874 1 month ago

      @@overwrite_oversweet sounds like a name of a scam 😂😟

  • @KeyStorm
    @KeyStorm 1 month ago

    Super Saiyan Transformation Theo?

  • @aeronwolfe7072
    @aeronwolfe7072 1 month ago

    HA! You said it sounds like something Terrence Howard would say.... LOLOLOL

  • @Spinikar
    @Spinikar 1 month ago

    Hahah I don't program sober either

  • @JacopoPatrocloMartinelli
    @JacopoPatrocloMartinelli 1 month ago

    C'mon! A Corona is basically a tea

  • @stephenjames2951
    @stephenjames2951 1 month ago

    omg corona and you live in SF!

  • @stephenleblanc4677
    @stephenleblanc4677 1 month ago +1

    P.S. Find some real beer to drink.

  • @c99a2o
    @c99a2o 1 month ago

    try "04-x"

  • @modoulaminceesay9211
    @modoulaminceesay9211 1 month ago

    Mistakes are allowed, but not always

  • @modoulaminceesay9211
    @modoulaminceesay9211 1 month ago

    It's Linux, not the same thing

  • @worldadmin9811
    @worldadmin9811 1 month ago +1

    ethics being thrown out the window at paces never thought before possible

  • @ronniebasak96
    @ronniebasak96 1 month ago

    Well, your overlay says Ollama and you say Meta. You are clearly going insane over time. I would suggest taking some mental health advice from ThePrimeagen.

  • @XMaster96DE
    @XMaster96DE 1 month ago +1

    I am sorry Theo, but your explanation of the model sizes is incorrect.
    The B stands for billions of parameters, meaning the number of variables you have in your mathematical function to approximate a source data distribution.
    PS: all the Llama 3 models were trained on around 15T tokens, i.e. 15 trillion word pieces.

  • @ebert7955
    @ebert7955 1 month ago +3

    "open source" indeed

  • @aeronwolfe7072
    @aeronwolfe7072 1 month ago

    you NEED to throw a LIME in that Corona.... :)

  • @MunishMummadi
    @MunishMummadi 1 month ago

    Dude come on 😂. What more do u want.

  • @thomassynths
    @thomassynths 1 month ago

    Zealots like to stupidly pretend that OpenAI is written as OpenSourceAI.

  • @modoulaminceesay9211
    @modoulaminceesay9211 1 month ago

    Linux is the same

  • @stephenleblanc4677
    @stephenleblanc4677 1 month ago

    You're adorable!

  • @12crenshaw
    @12crenshaw 1 month ago

    I'm sorry but you look ridiculous chugging Corona and saying "I need to drink more"

  • @jeremyatlas4504
    @jeremyatlas4504 1 month ago +4

    IM THE FIRST VIEW

  • @mr.random8447
    @mr.random8447 1 month ago

    AI is not cost effective

  • @parimalTeK
    @parimalTeK 1 month ago +1

    Just like the thumbnail of this video says, "misleading"... this video is misleading. This is what happens when you become a YouTuber and make videos for views.