Three years ago I ran some models in TensorFlow.js, also on mobile. They were not performant, but way cheaper than running on a server.
Humble beginnings back in 2018, but many of the TFJS Models are very fast now with WebGPU and such.
Excellent presentation! 👏
If models can be separated by purpose, many local on-device models are likely to emerge soon.
Excellent video
When will it become part of the specification?
GenAI was not meant to be specialized. You need a set of quality data to fine-tune an AI and make it specialized.
Thank you for sharing!
However, I have a few questions about WebLLM that I'd like to ask:
Which country or company/team developed WebLLM?
Does WebLLM work based on the principle of running LLMs locally on a computer?
However, it seems to download WebLLM's default models from a website.
In this case, is it considered local operation? What about the privacy risks of conversation data?
If WebLLM uses default models from the internet, can I still use it completely offline after disconnecting from the network?
Can WebLLM utilize locally downloaded GGUF models, similar to tools like Ollama or LM Studio?
Thank you!
Correct, WebLLM works entirely locally in the browser on your own hardware, no cloud.
Awesome!
Like with NotebookLM, Google is doing a great job for the AI community. We the public regret all the more that the corporate marketing department is so lousy at promoting the results. Will it always be like this? 😢
They make easy money from their search ads monopoly. That makes their whole sales and marketing operation very sloppy. It's a culture problem that won't be fixed easily.
@chaomingli6428 I can't resist comparing Google to Xerox, which let its PARC (Palo Alto Research Center) invent the mouse and the WYSIWYG concept and did nothing with them. Who still remembers Xerox?
3rd like 😂
4th