RunPod · 10mo ago
ACiDGRiM

secure connections

I want to ensure all traffic between my app and the serverless backend is encrypted. Does the endpoint decrypt the traffic from the internet and transmit it in plaintext to the serverless container? Specifically, is the data in my prompt in clear text, even in memory, before it reaches the container?
Solution:
In theory you could make your own worker whose input is an encrypted payload that gets decrypted on the container itself, though you would need to write that code yourself.
11 Replies
Madiator2011 · 10mo ago
It mostly depends on how you write your worker. All endpoints use an HTTPS proxy, so traffic is encrypted. Usually you send a JSON object to the API and that info is used to start the script for the app. After the job is done, serverless does not store the input information, only the link for the output. For full privacy you are advised to use your own S3 bucket, as otherwise you might get a b64-encoded image back.
ACiDGRiM (OP) · 10mo ago
OK, so the best way to obscure my prompt would be to encrypt the message in the API request body and decrypt it in the container? It sounds like the endpoint handles plaintext JSON rather than transparently forwarding the request to the container with TLS intact. Or does the URL endpoint determine which container to forward to? In other words, which TLS cert is seen by curl: api.runpod.io or my custom cert?
ashleyk · 10mo ago
You can't use custom certs; it is always the RunPod cert.
ACiDGRiM (OP) · 10mo ago
OK, I thought so. I want to get embeddings of all my personal documents, so I'm just trying to find a way to feel comfortable sending the text to another computer over the internet. I don't mind them being in RAM for the inference, but I don't want them exposed between the API and the container. Maybe that's what the server option is for; I'm just trying to save a buck.
Solution
Madiator2011 · 10mo ago
In theory you could make your own worker whose input is an encrypted payload that gets decrypted on the container itself, though you would need to write that code yourself.
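A minimal sketch of what such a worker could look like, assuming the runpod Python SDK and symmetric encryption with cryptography's Fernet; the DECRYPTION_KEY environment variable, the "encrypted" field name, and the placeholder inference step are illustrative choices, not part of RunPod's API:
import os

import runpod
from cryptography.fernet import Fernet

# Hypothetical setup: the key is supplied by you (e.g. as an env var you control),
# so the plaintext only ever exists inside the container.
fernet = Fernet(os.environ["DECRYPTION_KEY"])


def handler(job):
    # job["input"] is whatever was POSTed under "input" in the request body.
    ciphertext = job["input"]["encrypted"]
    prompt = fernet.decrypt(ciphertext.encode()).decode()

    # Stand-in for the real PyTorch inference on the decrypted prompt.
    result = f"(placeholder output for: {prompt})"

    # Encrypt the response before it leaves the container.
    return {"encrypted": fernet.encrypt(result.encode()).decode()}


runpod.serverless.start({"handler": handler})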
ashleyk · 10mo ago
The serverless workers are all in secure cloud though, and data is transmitted over TLS, so I don't see any issue.
ACiDGRiM (OP) · 9mo ago
OK, that's what I'm going to do, but just encrypt the payload. Traffic out of the container isn't proxied, so I can download the decryption key from my network?

I'm sure you're secure against most threats, but if I'm not sending my documents to Google, I want to limit my exposure to other third parties out of principle, at least until I can afford an L40S. If I don't control the keys when it's my private info, it's not secure. You guys have good infra, but I have no idea who you are or whose server the worker is on. I'll accept that my files being in VRAM for a few minutes is acceptable.

Please confirm this will work: I have a PoC of a feature that sends an encrypted body to the run API endpoint, decrypts it and passes it to a typical PyTorch workload, then encrypts the response and sends it back. Suppose I send the data you have listed in your sync and async endpoints documentation:
curl -X POST https://api.runpod.ai/v2/{endpoint_id}/run \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer ${API_KEY}' \
-d '{"input": {"prompt": "Your prompt"}}
but with the data/body
{"encrypted": "asdf3wqcm84wmt87v4e7mtasrhcrdgdc"}
{"encrypted": "asdf3wqcm84wmt87v4e7mtasrhcrdgdc"}
will the endpoint forward it to my endpoint ID as is, or do you sanitize for proper prompts? Also, for the stream endpoint the encrypted body will completely change after every returned token is encrypted; this should be transparent when streaming the decrypted original response, but I'm not sure if your API will handle that.
Madiator2011 · 9mo ago
I mean you would need to adjust your worker code to handle decryption
ACiDGRiM (OP) · 9mo ago
Yes, I have a proof of concept that does this. I just want to make sure your API doesn't do any sanitization on the data and passes it directly to the worker.
ashleyk · 9mo ago
As long as it's valid JSON it should be fine. Also, the body must have:
{
"input": ...
}
Serverless doesn't work without input. So if you want to use encrypted as a key in the JSON, put it inside input.
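A rough client-side sketch of that shape, assuming the same Fernet key as the worker above; SHARED_KEY, ENDPOINT_ID, and API_KEY are placeholder environment variables you would supply, not names from RunPod's docs:
import os

import requests
from cryptography.fernet import Fernet

# Encrypt the prompt locally before it ever leaves your machine.
fernet = Fernet(os.environ["SHARED_KEY"])
ciphertext = fernet.encrypt(b"Your prompt").decode()

# POST to the run endpoint with the ciphertext nested inside "input".
resp = requests.post(
    f"https://api.runpod.ai/v2/{os.environ['ENDPOINT_ID']}/run",
    headers={"Authorization": f"Bearer {os.environ['API_KEY']}"},
    json={"input": {"encrypted": ciphertext}},
)
print(resp.json())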
ACiDGRiM (OP) · 9mo ago
Thanks, that's the piece of info I wanted to know. I'll modify my promptProxy to work accordingly