Local chat and code completion with Cody and Ollama (Experimental)

  • Published Nov 8, 2024

COMMENTS • 16

  • @mikexie-do5dd (7 months ago, +2)

    Does Cody with Ollama also have a limit like "500 autocompletions per month" on the free version?

    • @ado (7 months ago)

      No. You can use it as much as you want.

    • @Sourcegraph (7 months ago, +1)

      Nope!

  • @chrishardwick2309 (6 months ago, +4)

    My settings.json file does not look like this. I do not have a cody.autocomplete.experimental.ollamaOptions entry, but my settings are pointed at experimental-ollama. I cannot select an Ollama model and I feel like it's just defaulting to Claude 2.
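
    For reference, a rough sketch of the experimental Ollama autocomplete configuration discussed here, as it might appear in settings.json. Only the cody.autocomplete.experimental.ollamaOptions key is named in this comment; the provider key, endpoint URL, and model name below are assumptions for illustration, not confirmed values.

      {
        // Point autocomplete at the experimental Ollama provider (key name assumed)
        "cody.autocomplete.advanced.provider": "experimental-ollama",
        "cody.autocomplete.experimental.ollamaOptions": {
          // Placeholder local Ollama endpoint and model
          "url": "http://localhost:11434",
          "model": "codellama:7b-code"
        }
      }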

  • @carlosmagnoabreu2827 (29 days ago)

    The "Experimental" Ollama Models are not being shown in the select models tab at my work computer. On my home computer they appear.

  • @AdityaLadwa (2 months ago, +1)

    Is this available for JetBrains editors?

    • @TyMac711 (1 month ago)

      Would also like to see this happen.

  • @ConfusedSourcerer (3 months ago)

    When trying to use Cody with Ollama and a local LLM, the chat is working fine, but with the setup recommended in this video, the autocomplete returns chat suggestions instead of code. Any idea what's causing this and how to fix it?

  • @adamvelazquez7336 (6 months ago, +2)

    When I install Cody I see no option at all to select Ollama chat. Has it been removed?

    • @adamvelazquez7336 (6 months ago)

      Currently running on Windows.

    • @ado (6 months ago, +1)

      Hey Adam - so if you aren't seeing the option, you can add the setting manually in your settings.json file. Just add this property: "cody.experimental.ollamaChat": true, restart VS Code, and it should work.

    • @adamvelazquez7336 (6 months ago)

      @@ado Interestingly, I got it working, and then the following day when I tried to use Ollama it wasn't working, without me making any changes to my settings. What I did to get it working the first time was change Cody to the prerelease version (the one that supported everything described in the video) and just follow the steps in the video. Then I restarted VS Code and it worked.
      Why it stopped working the following day I have no clue. It also keeps resetting the model back to Claude 2.

    • @ado (6 months ago)

      @@adamvelazquez7336 Hey Adam - can you check your `settings.json` file and make sure that the "cody.experimental.ollamaChat": true property still exists? If it does, the local LLMs should load from Ollama; otherwise, they won't. It's possible that flag got removed when you updated (although unlikely).
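
      As a minimal sketch of the setting discussed in this thread, the relevant settings.json entry would look roughly like the block below. Only the "cody.experimental.ollamaChat" flag comes from the replies above; the surrounding file is just a bare VS Code user settings file.

        {
          // Enable Cody's experimental Ollama-backed chat models, then restart VS Code
          "cody.experimental.ollamaChat": true
        }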

  • @maxint2680 (3 months ago)

    Somehow my autocompletion has been broken since I moved to another place. Nothing is suggested even when I use "trigger autocomplete at cursor".

    • @maxint2680 (3 months ago)

      Chat and other commands are working, though.