This extension hosts an ollama-ui web server on localhost
How to set a prompt
Yes, it's true that it only works with Ollama on localhost. But my Ollama runs on another server, exposed through Open WebUI. So I set up a reverse proxy, http://api.ai.lan -> 10.XX.XX.XX:11435, but the extension can't access it.

Then I also tested with the direct IP, http://10.1.33.231:11435, but you force the default port: failed to fetch -> http://10.1.33.231:11435:11434/api/tags

Finally, I made an SSH tunnel: ssh -L 11434:localhost:11435 [email protected]

It works, but it's not pretty.
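For anyone hitting the same limitation, here is a minimal sketch of that tunnel workaround. It assumes the remote Ollama (or the Open WebUI proxy in front of it) listens on port 11435, as in my setup; user@your-ollama-host is a placeholder for your own server:

    # Forward the extension's hard-coded local port 11434
    # to the remote Ollama on port 11435; -N opens the tunnel
    # without running a remote command.
    ssh -N -L 11434:localhost:11435 user@your-ollama-host

    # Verify the tunnel by listing the remote models locally:
    curl http://localhost:11434/api/tags

As long as the tunnel stays open, the extension talks to localhost:11434 as usual and never needs to know the real server address.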