CF OT + OLLAMA API
Hey there,
We're trying to set up a local AI (Ollama) with CF Tunnels via Docker, and we were wondering how one might go about securing the Ollama API with an auth token?
There is a basic auth header, but where should that go?
How we imagined things working is:
URL: ai.example.tld (We could have CF direct users to the path /v1/chat/completions)
Currently, if someone hits ai.example.tld they just get "Ollama is running", because the tunnel sends them straight to the local service, ollama:11434.
Isn't that a security risk because there is no authentication?
Can we have it so that when you hit ai.example.tld you authenticate with a CF Access application, or maybe a service auth token, so that a client can just hit ai.example.tld with the token?
Sorry for going all over the place...
4 Replies
Anyone have thoughts?
Yes, it's a pretty big security risk; I wouldn't leave it like that.
There are actually several options you could choose from, each with its own quirks:
- you could use Cloudflare service auth tokens; this would require adding custom headers to your application to authenticate, which is difficult if you plan to use e.g. the OpenAI-compatible endpoint (see the sketch after this list)
- you could use a third-party application to add bearer auth, e.g. https://github.com/stephan-buckmaster/ruby-bearer-auth-proxy, which should work better with OpenAI client libraries if you need them
- you can set up a regular CF Access login and then use the cloudflared access commands on the machine that needs access, to securely proxy the traffic back to a localhost address on that host
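Roughly what the service-token option looks like from the client side; a minimal sketch, assuming an Access service token has already been created in Zero Trust, and that ai.example.tld, the env var names, and the model name are placeholders rather than anything from this thread:

```ts
// Call Ollama's OpenAI-compatible endpoint through Cloudflare Access with a
// service token. Access checks the two CF-Access-* headers and rejects the
// request before it ever reaches ollama:11434 if they are missing or wrong.
const resp = await fetch("https://ai.example.tld/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "CF-Access-Client-Id": process.env.CF_ACCESS_CLIENT_ID!,         // service token ID
    "CF-Access-Client-Secret": process.env.CF_ACCESS_CLIENT_SECRET!, // service token secret
  },
  body: JSON.stringify({
    model: "llama3",
    messages: [{ role: "user", content: "Hello!" }],
  }),
});
console.log(await resp.json());
```

The quirk is exactly this: anything that only lets you configure a base URL and an API key can't attach those extra headers, which is when the bearer-auth proxy or the cloudflared access option makes more sense.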
I think I may have found something, but how does this solution sound?
https://developers.cloudflare.com/cloudflare-one/tutorials/access-workers/
My specific use case is running Ollama via Docker; I'm also running Open WebUI, but I want the flexibility to add other services such as the Brave BYOM Leo local LLM chat addon.
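If you go with the Worker approach from that tutorial, the core of it ends up being only a few lines; a minimal sketch, assuming module-syntax Workers, a secret named OLLAMA_TOKEN (the name is made up here, added with wrangler secret put OLLAMA_TOKEN), and something in front of Ollama, like the bearer-auth proxy mentioned earlier, that actually verifies that token:

```ts
// Cloudflare Access decides who gets through; this Worker then attaches the
// credential the origin expects before the request is forwarded on.
export interface Env {
  OLLAMA_TOKEN: string; // assumed secret name, set via `wrangler secret put`
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // Copy the incoming headers and add the Authorization header that the
    // proxy sitting in front of Ollama is configured to check.
    const headers = new Headers(request.headers);
    headers.set("Authorization", `Bearer ${env.OLLAMA_TOKEN}`);
    // Re-issue the request, with the extra header, to the origin behind this
    // route (the tunnel hostname).
    return fetch(new Request(request, { headers }));
  },
};
```

Access still decides who is allowed in; the Worker just means the Ollama-side credential never has to be configured in each individual client (Open WebUI, Brave Leo, etc.).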