austinm
'The information you’re about to submit is not secure'
I have a very simple FastAPI page that I just started hosting on Railway via GitHub. It has a text-file upload form. When the form is submitted, the browser shows:
The information you’re about to submit is not secure
Because this form is being submitted using a connection that’s not secure, your information will be visible to others.
I can click 'Send anyway' and the site continues as intended; I just don't want that message popping up for users. I have cleared the cache and tried other devices.
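This warning usually means the form's action resolves to an http:// URL. Behind a TLS-terminating proxy like Railway's, the app itself only sees plain HTTP, and the original scheme arrives in the X-Forwarded-Proto header; if the app builds absolute URLs from what it sees, they come out as http:// and trigger the warning. A minimal sketch of the idea (the function name is illustrative; in practice, running uvicorn with its `--proxy-headers` flag, or simply using a relative form action like `action="/upload"`, has the same effect):

```python
# Hedged sketch: build the client-facing base URL by trusting the
# scheme the proxy forwards, instead of the plain-HTTP scheme the
# app sees directly. external_base_url is a hypothetical helper.

def external_base_url(headers: dict, host: str) -> str:
    """Return the base URL as the client sees it."""
    # Railway's proxy terminates TLS and forwards the original scheme here.
    scheme = headers.get("x-forwarded-proto", "http")
    return f"{scheme}://{host}"

# Without the forwarded header, the app assumes plain http...
print(external_base_url({}, "example.up.railway.app"))
# ...with it, generated URLs stay on https and the browser warning goes away.
print(external_base_url({"x-forwarded-proto": "https"}, "example.up.railway.app"))
```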
22 replies
Trouble using websockets with Flask
I am trying to host a Flask site that uses WebSockets. After many attempts at installing different packages, I seem to have hit a dead end with this error:
_proto_tcp = socket.getprotobyname('tcp')
OSError: protocol not found
This seems to be the solution when hosting locally:
Yes, you should have a /etc/protocols file. It must have been deleted somehow. It comes from the netbase package.
This should reinstall it:
sudo apt-get -o Dpkg::Options::="--force-confmiss" install --reinstall netbase
But as I am working with Railway, I don't have access to that method. Any help?
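One way to get the equivalent of that apt command on Railway is to build the service from a custom Dockerfile, since Railway will use a Dockerfile in the repo root instead of its default builder. A hedged sketch, assuming a standard Python app layout (`requirements.txt`, entry point `app.py` — adjust to your project): installing the `netbase` package restores `/etc/protocols`, which `socket.getprotobyname('tcp')` reads.

```dockerfile
# Hypothetical Dockerfile for a Flask + WebSockets app on Railway.
# netbase provides /etc/protocols; install it explicitly in case the
# base image lacks it.
FROM python:3.11-slim

RUN apt-get update \
    && apt-get install -y --no-install-recommends netbase \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .

CMD ["python", "app.py"]
```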
6 replies
Streaming response to client
I have an API hosted on Railway that uses FastAPI and WebSockets to talk to the client. The output is a streaming response such as this:
for bot_response in return_message:
    await websocket.send_text(f"{bot_response}")
This works fine when hosted locally: the client receives the response as it is generated. However, when hosted on Railway, the response is released all at once. I can't tell if this is just a latency issue or whether there are configurations I can change, because the response does send the individual text segments; it just waits a while and then sends them all very quickly.
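One server-side thing worth ruling out is that the loop never yields control to the event loop between sends, so frames pile up before being flushed. A minimal sketch of the streaming loop with an explicit yield after each send; `fake_send` is a hypothetical stand-in for `websocket.send_text` so the loop can run without a real connection. If frames already leave the server one by one, the batching is more likely buffering in the proxy in front of the app.

```python
import asyncio

async def stream_chunks(send, chunks):
    """Send each chunk as its own frame, yielding between sends."""
    for bot_response in chunks:
        await send(f"{bot_response}")
        # Yield to the event loop so the frame can be flushed
        # immediately instead of being coalesced with later ones.
        await asyncio.sleep(0)

sent = []

async def fake_send(text):
    # Stand-in for websocket.send_text: record what was "sent".
    sent.append(text)

asyncio.run(stream_chunks(fake_send, ["Hel", "lo ", "world"]))
print(sent)  # -> ['Hel', 'lo ', 'world']
```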
28 replies