Browser-native LLMs at your fingertips
Unleash the Power of AI in Your Browser!
WebextLLM is the first extension to embed large language models directly in the browser. The LLMs run in an isolated environment within the extension, with zero configuration and no external dependencies, making it the easiest way to run local inference on any platform (with 7 supported LLMs to choose from!).
Experience a growing ecosystem of AI-based applications built on the window.ai API as a user, and fuel it further as an application developer.
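For developers, here is a minimal sketch of what calling a locally hosted model through window.ai can look like. The getCompletion signature and the askLocalModel helper below are illustrative assumptions, not a definitive API reference; consult the window.ai documentation for the exact shape in your version.

```ts
// A minimal sketch of how a web app might call a locally hosted model
// through window.ai. The getCompletion signature below is an assumption
// based on window.ai client conventions.

type CompletionOutput = { text: string };

type WindowAI = {
  getCompletion(input: { prompt: string }): Promise<CompletionOutput>;
};

async function askLocalModel(prompt: string): Promise<string> {
  // The extension injects window.ai; it is missing if the user has not
  // installed a provider or has denied this site access to the model.
  const ai = (window as Window & { ai?: WindowAI }).ai;
  if (!ai) {
    throw new Error("window.ai not found - is a provider extension installed?");
  }
  const result = await ai.getCompletion({ prompt });
  return result.text;
}

// Example usage: the extension asks the user to approve the request
// before the prompt ever reaches the model.
askLocalModel("Summarize the benefits of local inference in one sentence.")
  .then(console.log)
  .catch(console.error);
```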
Key Features
* Own your models: Experience the freedom of owning your LLMs, enjoying a limitless, offline, private, and secure environment
* Control: Exercise complete control over access to your LLM, granting or denying permission to any application at any time
* Visibility: Gain insight into the history of prompts sent to your model, and the responses it returned, across different applications
Why should you use WebextLLM? It's:
1. Free: The model runs on your hardware, eliminating the need for costly service providers
2. Private: Your data remains securely on your device, safeguarding your privacy
3. Unlimited: Take full control of the model without any quotas, censorship, or limitations
4. Highly available: Overcome limitations of internet connectivity and cloud-based LLM availability
Disclaimer
This extension is a proof of concept built on experimental technologies and is not recommended for production use. By using this software, you assume all risks, including potential data loss, system failure, or other issues. The models and applications featured in this project are not endorsed by the author, who is not responsible for any issues arising from their use.