LLMs are coming to the browser.
Can we generate embeddings and store them as vectors for WebLLM's context, all inside the browser? The main goal is to provide an all-in-one, local, in-browser solution for freemium LLM users in an app I'm planning to build.
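Here's a rough sketch of the pipeline I have in mind, assuming Transformers.js (`@xenova/transformers`) for in-browser embeddings and `@mlc-ai/web-llm` for generation. The model IDs, the naive in-memory store, and the helper names (`addDoc`, `retrieve`, `ask`) are just placeholders for illustration, not a working product:

```typescript
// Sketch: in-browser embeddings + naive vector store + WebLLM context.
// Assumes @xenova/transformers and @mlc-ai/web-llm; model IDs are examples.
import { pipeline } from "@xenova/transformers";
import { CreateMLCEngine } from "@mlc-ai/web-llm";

type Doc = { text: string; vector: number[] };

const store: Doc[] = []; // naive in-memory store; could be persisted to IndexedDB

// Small embedding model that runs locally via WASM/WebGPU
const embed = await pipeline("feature-extraction", "Xenova/all-MiniLM-L6-v2");

async function addDoc(text: string) {
  const out = await embed(text, { pooling: "mean", normalize: true });
  store.push({ text, vector: Array.from(out.data as Float32Array) });
}

function cosine(a: number[], b: number[]) {
  // vectors are normalized above, so the dot product is the cosine similarity
  return a.reduce((sum, v, i) => sum + v * b[i], 0);
}

async function retrieve(query: string, k = 3) {
  const out = await embed(query, { pooling: "mean", normalize: true });
  const q = Array.from(out.data as Float32Array);
  return store
    .map((d) => ({ ...d, score: cosine(q, d.vector) }))
    .sort((a, b) => b.score - a.score)
    .slice(0, k);
}

// Feed the retrieved chunks to WebLLM as context
const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC");

async function ask(question: string) {
  const context = (await retrieve(question)).map((d) => d.text).join("\n---\n");
  const reply = await engine.chat.completions.create({
    messages: [
      { role: "system", content: `Answer using this context:\n${context}` },
      { role: "user", content: question },
    ],
  });
  return reply.choices[0]?.message.content;
}
```

For persistence I'd probably serialize the vectors to IndexedDB rather than keep everything in memory, but I'm unsure how well a brute-force search like this scales in the browser, which is partly why I'm asking.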