CallMeFred
Issue with Browserless template
Hi, maybe you can help me. I have set up the Browserless template in a project, but each time I try to run a simple Selenium screenshot capture function,
I get this error, even though my variables are properly configured in my web app (in the same project):
browserless.io:server:trace Handling inbound HTTP request on "POST: /webdriver/session" +9m
browserless.io:server:error No matching HTTP route handler for "POST: http://0.0.0.0:8080/webdriver/session?launch=%7B%7D"; +9m
I'm using the internal endpoint: http://browserless.railway.internal:3001/webdriver
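For context, here is a minimal sketch of the kind of Selenium screenshot capture described, assuming selenium 4's Remote WebDriver; the variable names (BROWSER_WEBDRIVER_ENDPOINT, BROWSER_TOKEN) and the browserless:token capability are assumptions, not confirmed template settings:

import os
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

# Assumed variable names -- they may differ from what the template actually exposes
endpoint = os.environ.get("BROWSER_WEBDRIVER_ENDPOINT", "http://browserless.railway.internal:3001/webdriver")
token = os.environ.get("BROWSER_TOKEN", "")

options = Options()
options.set_capability("browserless:token", token)  # auth capability name is an assumption

driver = webdriver.Remote(command_executor=endpoint, options=options)
try:
    driver.get("https://example.com")
    driver.save_screenshot("screenshot.png")  # capture the page to a local PNG
finally:
    driver.quit()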
18 replies
Pricing Question
I'm building a Python app to generate videos.
Early tests show that each video generation needs about 600 MB of memory and roughly 4% of a 2.3 GHz processor.
What would be the cost to generate one video on Railway?
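As a back-of-the-envelope estimate, the cost per video is just usage × rate × duration. The per-GB and per-vCPU rates below are placeholders to check against Railway's current pricing page, and the 5-minute generation time is hypothetical:

# Rough per-video cost under usage-based pricing.
# The rates are assumptions -- verify against the current pricing page.
MEM_RATE_PER_GB_MIN = 0.000231    # assumed $/GB/minute
CPU_RATE_PER_VCPU_MIN = 0.000463  # assumed $/vCPU/minute

minutes_per_video = 5   # hypothetical generation time
memory_gb = 0.6         # ~600 MB from the early tests
vcpu_used = 0.04        # ~4% of one core

cost = minutes_per_video * (memory_gb * MEM_RATE_PER_GB_MIN
                            + vcpu_used * CPU_RATE_PER_VCPU_MIN)
print(f"~${cost:.5f} per video")  # about $0.0008 with these placeholder numbers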
10 replies
Very weird "bug" - Celery (Redis)
Hi, I'm experiencing a very weird bug.
I deployed a Flask web app whose main .py files are:
- main.py, at the root of the project, with most of the code, including a function that triggers a Celery task located in tasks.py
- celery_config.py, with the Celery configuration (sketched at the end of this post)
- tasks.py, where my Celery task lives
I have 2 services in my project
- a Redis database
- the Flask Web App
When I trigger the action in my web app that is supposed to launch the Celery task, I see a (celery) entry appear in Redis, which proves the web app can communicate with Redis, but the task never actually starts.
But when I then spin up Celery locally on my computer (using the same start command as in my Procfile), the task is picked up and completed.
The problem is that the output file, which was supposed to be generated and stored in the cloud by the web app, ends up on my local machine, since the task actually completes there rather than in the deployed app.
In my Procfile (for the Flask web app deployment), I have the two lines that are supposed to spin up both the web server and the Celery worker:
web: gunicorn main:app
worker: celery -A celery_config worker --loglevel=info
What am I missing?
Why isn't Celery working as expected in the cloud version of my app?
It works fine locally (connecting to the same Railway Redis database as a broker, via the same URL).
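For reference, here is a minimal sketch of the Celery wiring described above; the REDIS_URL variable name, the exact module layout, and the task name are assumptions, not the actual code:

# celery_config.py -- the Celery app, with the broker URL read from the environment
import os
from celery import Celery

broker_url = os.environ.get("REDIS_URL", "redis://localhost:6379/0")

app = Celery(
    "tasks",
    broker=broker_url,
    backend=broker_url,
    include=["tasks"],  # makes the worker import and register the tasks in tasks.py
)

# tasks.py -- a task triggered from main.py, e.g. generate_output.delay(job_id)
# from celery_config import app
#
# @app.task
# def generate_output(job_id):
#     ...  # produce the file and store it

This layout matches the Procfile's worker command (celery -A celery_config worker --loglevel=info), since Celery picks up the app instance defined in celery_config.py.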
136 replies