Bolt.new + Ollama: AI Creates APPS 100% Local in Minutes

  • Published 12 Nov 2024

COMMENTS • 61

  • @noahgottesla3439 · 3 days ago

    Yo. You turn the tedious, fiddly parts of the process into a juicy episode, which is great long-term for the YouTube algorithm.

  • @fourlokouva · 1 day ago

    Thank you for your contributions Mervin! Including a repo link in the description would be helpful, just saying.

  • @mrsebbig · 1 day ago

    Many thanks for the great video.
    One question: where does the LLM get downloaded to? I want to free up space to try another LLM. How can I delete the 4 GB again?
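For context on this question: with a standard Ollama install, downloaded models live under `~/.ollama/models` (or `%USERPROFILE%\.ollama\models` on Windows), and the CLI can remove them. The model name below is the one used in the video; a sketch, not from the video itself:

```shell
# List locally downloaded models and their sizes
ollama list

# Remove a model to reclaim disk space (model name assumed from the video)
ollama rm qwen2.5-large:7b
```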

  • @MIkeGazzaruso · 15 hours ago

    Can this also generate a backend, or only frontend? If only frontend, it's a waste of time.

  • @d.d.z. · 4 days ago

    I like this approach. Thank you Mervin

  • @CraigRussill-Roy · 4 days ago

    Love your videos - nice work on showing warts and all

  • @MagicBusDave · 1 day ago

    Just can't get Ollama models to appear under Ollama. Two hours of diagnosing with Claude and still nothing; everything appears to be running.

  • @sefyou4171 · 1 day ago

    Can you show how we can use other LLMs, not only qwen2.5-large:7b?

  • @viangelo4z595 · 4 days ago

    Many thanks

  • @HikaruAkitsuki · 3 days ago

    How do you find the max context length for each model?
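One way to answer this, assuming a recent Ollama CLI: `ollama show` prints a model's metadata, which includes its context length. The model name below is the one from the video:

```shell
# Print model metadata; the output includes a "context length" line
# (model name assumed from the video)
ollama show qwen2.5-large:7b
```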

  • @naturelife418 · 4 days ago

    Great job, well done!

  • @munduzikkachok1936 · 9 hours ago

    Why the hell does it not ask for an API key when you install it, but when I do it, it does?

  • @tonywhite4476 · 4 days ago

    This is awesome!!!

  • @John-ek5bq · 4 days ago

    What is the best LLM for coding apps?

  • @mr.gk5 · 4 days ago

    Can I also use the OpenAI API for this application?

  • @JNET_Reloaded · 4 days ago

    Do we need a graphics card for this?

  • @naturelife418 · 4 days ago

    For some reason my instance defaults to Anthropic no matter if I select Ollama. I discovered that because the Anthropic key was not set and it complained about it in the error output, even when an Ollama model was selected.

  • @ShaikSadiq-zs6yj · 2 days ago

    Where can I find the modelfile?

  • @wobbelskanker · 1 day ago

    I've done everything several times but I'm not getting the actual code and files.

    • @MervinPraison · 1 day ago

      Did you try increasing the context length?
      Also try various models.

    • @wobbelskanker · 1 day ago

      @MervinPraison Thanks for the reply. Do I need to change the modelfile or any of the commands if I change the model? Also, should I increase it more than you have set?

  • @John-ek5bq · 4 days ago

    Is bolt.new using Claude by default?

  • @ShyamnathSankar-b5v · 4 days ago

    I have used gemini-1.5-pro but it's also not working. Gemini has a context length of 2M, so there must be some problem with the software itself.

  • @ZeeQueIT · 2 days ago

    Well, why not use an OpenRouter API key to test the local Bolt? There you can find the big models free to use, and bigger context lengths as well.

  • @mikevanaerle9779 · 3 days ago

    How do you make the modelfile? What kind of file does it have to be? I have no clue how to make this in my CMD prompt (Windows computer).

    • @MervinPraison · 3 days ago

      Right-click and create a new file. Name it modelfile.

    • @mikevanaerle9779 · 3 days ago

      @MervinPraison Thanks. But what kind of file format do I need to make? Just a folder, txt, ...?

    • @MervinPraison · 3 days ago

      @mikevanaerle9779
      Create modelfile.txt and run the command below:
      ollama create -f modelfile.txt qwen2.5-large:7b
      Doc: mer.vin/2024/11/bolt-new-ollama/

    • @mikevanaerle9779 · 3 days ago

      @MervinPraison Thank you

    • @mikevanaerle9779 · 3 days ago

      @MervinPraison For me it does not show the preview or the code on the right.

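For readers following this thread: a minimal modelfile for this setup typically just extends a base model and raises the context window, since bolt.new sends large prompts. The base model name and num_ctx value below are assumptions for illustration, not confirmed from the video:

```shell
# Write a minimal modelfile — base model and context size are assumptions
# FROM names an already-pulled Ollama model to build on;
# PARAMETER num_ctx raises the context window
cat > modelfile.txt <<'EOF'
FROM qwen2.5-coder:7b
PARAMETER num_ctx 32768
EOF

# Register the customised model under the name used in the video
ollama create -f modelfile.txt qwen2.5-large:7b
```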
  • @Revontur · 4 days ago +1

    Good video as always, but I think you missed out the creation of the .env.local file.

    • @HikaruAkitsuki · 3 days ago

      He assumed that you already know how to use Ollama and make env tweaks.

  • @modoulaminceesay9211 · 4 days ago

    Thanks

  • @CalsProductions · 4 days ago +2

    Hey Mervin, you forgot to tell your viewers to change the .env file to use the Ollama local API.
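For anyone hitting this: the fork reads provider settings from a .env.local file in the repo root. The variable name below is what the bolt.new-any-llm fork's .env.example uses as far as I recall, so treat it as an assumption and verify against the repo's own example file:

```shell
# .env.local — minimal sketch; check the variable name against the repo's .env.example
# Points the app at a locally running Ollama server (default port 11434)
OLLAMA_API_BASE_URL=http://127.0.0.1:11434
```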

  • @moonduckmaximus6404 · 3 days ago

    DOES THIS WORK ON WINDOWS?

  • @GusRJ70 · 4 days ago

    To run more than 7B, we will need more RAM, right? 64 GB or more?

  • @tomasbusse2410 · 4 days ago

    Can I install this from within the VS terminal?

    • @MervinPraison · 4 days ago +1

      Yes, you can use any terminal.

    • @tomasbusse2410 · 4 days ago +1

      @ Oh great, this really looks interesting; I will try to install it. Thanks.

  • @marinob7433 · 4 days ago

    It works perfectly, but a GPU is a must to get the speed. For me, when I asked, it corrected some files.

  • @shay5338 · 4 days ago

    You should have given him credit!

  • @JNET_Reloaded · 4 days ago

    Nice :D

  • @dylanwarrener5857 · 1 day ago

    I think this is a good start, but it is still not that powerful. For someone who already codes fairly quickly, this feels much slower at the moment. Give it a couple of years and I reckon this might be worth it.

  • @writetopardeep · 4 days ago

    What all are we talking about here? An API?

    • @carstenli · 4 days ago

      This fork of bolt.new enables the use of any provider, including local (on-machine) inference provided by Ollama, as in this example.

  • @ShaikSadiq-zs6yj · 1 day ago

    Could you please do a proper video explanation again? This is not a good explanation, sir.

  • @clemenceabel5494 · 3 days ago

    Hey, I saw your videos. They're great and informative but your thumbnails are not appealing enough. I think you should hire a Professional Thumbnail Artist for your videos to increase your view count cause every impression matters. I can improve your ctr from 2-3% to 15%. Please acknowledge and share your contact details to get your thumbnail.

  • @Giulio.t · 4 days ago +2

    I don't understand anything from the start... You say "In your terminal", but what terminal? Dude, you can't start a video tutorial by assuming certain things.

    • @jstthomas1111 · 4 days ago

      Visual Studio Code terminal, or your preferred IDE. You are cloning the GitHub repository.

    • @zipaJopa · 4 days ago +7

      You can't expect the tutorial to start with an explanation of how to turn your computer on.

    • @AnimalCentral-l7p · 4 days ago

      @zipaJopa You must be the funniest person at home... this YouTuber didn't even share the git clone command, as he claimed in his video.

    • @zipaJopa · 4 days ago

      @AnimalCentral-l7p But it's bolt.new-any-llm?

    • @carstenli · 4 days ago +2

      Terminal / Shell / Console / Command Line all mean the same thing.
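To close the loop on this thread: the video uses the community "any LLM" fork of bolt.new. The repository path and run commands below are assumptions based on the fork's name mentioned in this thread; verify the exact URL in the video description:

```shell
# Clone the fork discussed in this thread (repo path assumed), then install and run.
# bolt.new uses pnpm, so the fork is assumed to as well.
git clone https://github.com/coleam00/bolt.new-any-llm
cd bolt.new-any-llm
pnpm install
pnpm run dev
```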