Pure Rust Serverless AI Chat Bot with a WASM frontend hosted statically on GitHub Pages.
- Published Feb 7, 2025
- The Fireside Chat (prev. Candle Chat) Bot is implemented in pure Rust using Mistral-7B with HuggingFace/Candle over Axum WebSockets (multi-user) and a Leptos (WASM) frontend using the Leptonic UI library!
The `serverless` branch now demonstrates how to host the frontend on GitHub Pages and the backend on Shuttle.rs 🚀
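At its core, the multi-user setup described above is a fan-out problem: each message must be delivered to every connected client session. The following is a minimal std-only sketch of that pattern, not code from the Fireside Chat repo — the `ChatHub` type and its methods are illustrative, and the real project uses Axum WebSocket tasks instead of in-process channels:

```rust
use std::sync::mpsc::{channel, Receiver, Sender};

/// Hypothetical hub that fans each message out to every subscribed
/// session -- the core idea behind a multi-user chat backend.
struct ChatHub {
    sessions: Vec<Sender<String>>,
}

impl ChatHub {
    fn new() -> Self {
        ChatHub { sessions: Vec::new() }
    }

    /// Register a new client; returns the receiving end of its channel.
    fn subscribe(&mut self) -> Receiver<String> {
        let (tx, rx) = channel();
        self.sessions.push(tx);
        rx
    }

    /// Deliver a message to all connected clients.
    fn broadcast(&self, msg: &str) {
        for tx in &self.sessions {
            // Ignore clients whose receiver has been dropped.
            let _ = tx.send(msg.to_string());
        }
    }
}

fn main() {
    let mut hub = ChatHub::new();
    let alice = hub.subscribe();
    let bob = hub.subscribe();

    hub.broadcast("model: hello!");

    assert_eq!(alice.recv().unwrap(), "model: hello!");
    assert_eq!(bob.recv().unwrap(), "model: hello!");
}
```

In the actual WebSocket setting, each `Sender` would instead be the write half of a client's socket, and the hub would live in shared state behind a mutex or an async broadcast channel.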
Github:
⭐Fireside Chat: github.com/dan... ⭐
🔙 Backend
Axum: github.com/tok...
HuggingFace/Candle: github.com/hug...
💻 Frontend
Leptos: github.com/lep...
Leptonic: github.com/lpo...
Docs:
Shuttle.rs: docs.shuttle.r...
Github Pages: docs.github.co...
Discord:
HuggingFace/Candle: / discord
Leptos: / discord
Shuttle.rs: / discord
Thank you so much! I was trying to learn Rust by working on a chatbot project with quantized Mistral for a non-English language with Candle, but with no example to follow for this particular setup, it was quite hard for me. Now I have a whole repo to learn from! 👏
Fantastic!
That's exactly what I created this for.
I'll be improving things a bit soon - once the new year settles in.
Awesome!! I’m hoping for more videos like this! I’m looking into candle and wonder if it can be implemented with RAG for PDFs. I’m sure there’s RAG that can work well with rust and candle. If you ever get around to it, that will be very neat to see :)) but thank you for this video
I plan on doing RAG for text/PDF eventually, but it's a bit down in the list of priorities.
I will try to remember to comment here when I get to it. :D
@danielclough you're awesome; yeah, on your own time, whenever you get to it. Putting this out for free is enough, and I'm grateful :)
Only the left audio channel has sound.
Thank you for pointing this out.
I'll be more careful next time.