Deploying H2O LLM Studio with auth using ngrok
I have been working most of the day to get this container deployed to RunPod. Here's the trick, though: I included nginx in the mix and am using it as a proxy_pass, so I can put some sort of auth in front of it.
Here is the nginx config.
events {
    worker_connections 1024;
}

http {
    # Include the default MIME types
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    # Logging settings
    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;

    # Configuration for the default server
    server {
        listen 80;

        auth_basic "Restricted Content";
        auth_basic_user_file /etc/nginx/.htpasswd;

        location / {
            proxy_pass http://127.0.0.1:10101;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;

            # WebSocket support
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            # proxy_set_header Origin http://127.0.0.1:10101;
        }
    }
}
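For reference, the .htpasswd file the config points at can be generated with htpasswd from apache2-utils, or with plain openssl if you'd rather not install it. A sketch (the username admin and the password are placeholders):

```shell
# Create an htpasswd entry using openssl's apr1 (salted MD5) hash,
# which nginx's auth_basic module understands.
printf 'admin:%s\n' "$(openssl passwd -apr1 's3cret')" > .htpasswd

# Show the result (the hash differs every run because of the random salt).
cat .htpasswd
```

In the container you'd write it to /etc/nginx/.htpasswd to match the auth_basic_user_file directive above.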
When I run the container locally, I can of course hit localhost and nginx proxies me through to LLM Studio running on port 10101. However, when I push this image to RunPod, the proxy URL RunPod provides is an https link.
So the generated runpod.net https link gives me horrible Origin errors. Any ideas on how I can adjust my config to work around them? Below are the container errors when trying to access the runpod.net proxy link.
H2O LLM Studio is running in the container just fine. I just can't use the generated proxy address runpod.net gives me after deploying the pod.
Check comments for my logs
5 Replies
use TCP ports, HTTP won't work with H2O LLM Studio
Expose ports | RunPod Documentation
Learn to expose your ports.
@Madiator2011 [EU] would I use a TCP port in my nginx config? This container runs perfectly locally, and with the NVIDIA Container Toolkit it sees the GPUs.
Here is what I am passing to runpod.
gpu_count = 2

pod = runpod.create_pod(
    name="H2O LLM Studio",
    image_name='container/container:latest',
    gpu_type_id='NVIDIA A40',
    data_center_id="US-KS-3",
    cloud_type="SECURE",
    gpu_count=gpu_count,
    volume_in_gb=150,
    container_disk_in_gb=5,
    ports="80/http",
    volume_mount_path="/data",
)
Do I need to add the TCP port here?
10101/tcp
That didn't work lol. It's something to do with the proxy I am trying to add to the H2O container.
ERROR:
2024-02-15T23:37:55.658627043Z 2024/02/15 23:37:55 # {"err":"websocket: request origin not allowed by Upgrader.CheckOrigin","t":"socket_upgrade"}
nginx.conf (running in the same container as H2O LLM Studio)
events {
    worker_connections 1024;
}

http {
    # Include the default MIME types
    include /etc/nginx/mime.types;
    default_type application/octet-stream;

    # Logging settings
    access_log /var/log/nginx/access.log;
    error_log /var/log/nginx/error.log;

    # Configuration for the default server
    server {
        listen 80;

        auth_basic "Restricted Content";
        auth_basic_user_file /etc/nginx/.htpasswd;

        location / {
            proxy_pass http://127.0.0.1:10101;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;

            # WebSocket support
            proxy_http_version 1.1;
            proxy_set_header Upgrade $http_upgrade;
            proxy_set_header Connection "upgrade";
            # proxy_set_header Origin http://127.0.0.1:10101;
        }
    }
}
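The socket_upgrade error in the logs comes from the backend's websocket CheckOrigin handler rejecting the Origin header sent through the RunPod proxy. One workaround worth trying (this is the idea behind the commented-out line in the config above, not something verified on RunPod) is to rewrite the Origin header so the backend sees the local origin it expects:

```nginx
location / {
    proxy_pass http://127.0.0.1:10101;

    # WebSocket support
    proxy_http_version 1.1;
    proxy_set_header Upgrade $http_upgrade;
    proxy_set_header Connection "upgrade";

    # Make the backend believe the request originated locally,
    # so its websocket CheckOrigin accepts the upgrade.
    proxy_set_header Origin http://127.0.0.1:10101;
}
```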
@Madiator2011 [EU] @justin Thanks for the tips on the ports. I ultimately got it working using ngrok, believe it or not. ngrok + RunPod? Yes please.
I don't have to EXPOSE any ports in my Dockerfile or in RunPod either. I set up the ngrok client in the container to accept requests to port 10101, passing the --basic-auth flag, which puts a login in front of H2O LLM Studio.
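Roughly, the ngrok side of that looks like this (a sketch using ngrok v3 syntax; user:password is a placeholder and you'd still need your ngrok authtoken configured in the container):

```shell
# Tunnel the LLM Studio port out through ngrok, with HTTP basic auth
# enforced at the ngrok edge before traffic ever reaches the container.
ngrok http 10101 --basic-auth "user:password"
```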
Wooooowww that is pretty smart haha
i only use ngrok for local testing, never thought about it for that situation
Nice~
Yeah, I've been using ngrok for a while as well. Work wanted me to deploy LLM Studio to RunPod for faster training, but they said it must be behind a login at least. I sat on this for 3 days lol. It only took some time away from the keyboard for the lightbulb moment!