Railay
Railay•2y ago

Postgres not releasing memory, maxed at 8GB for over an hour

Hi, I've got a decently large Postgres DB.
I was trying to run some queries to alter the table structure, but they never completed. I canceled the queries and after a long wait, had to manually kill the processes. Postgres is still maxed out at 8GB of memory. CPU is near 0%. Been like this for over an hour now. How can I clear this up? I cannot lose the data in the database.
Railay
Railay•2y ago
Actually, looking at my metrics, it's been at 8GB for the better part of 3 days. I've been migrating a lot of data from external sources, so it makes sense that there's some very high RAM usage.
But there are periods where it drops to a few hundred megabytes... so I know it doesn't need that much memory to run. I'm also running this exact environment locally and it's not consuming anywhere near 8 GB of memory. How can I solve this? It's racking up my hosting usage charges.

Hi @Angelo, could you or someone at Railway chime in here? It's racking up quite a bill, and I don't think Postgres should be maxing out like this. I don't think I have any way to address this on my own.
angelo
angelo•2y ago
Project ID? Keep in mind, Postgres looks at available RAM and will tune itself to consume that memory.
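To see what that tuning actually is, the memory settings can be read straight from psql (standard Postgres settings, nothing Railway-specific):

    -- Key memory knobs Postgres was started with
    SHOW shared_buffers;        -- dedicated shared cache
    SHOW effective_cache_size;  -- planner's estimate of available cache, not an allocation
    SHOW work_mem;              -- per-sort/hash memory, per operation

A large shared_buffers would explain steady high usage even while CPU sits near 0%.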
Railay
Railay•2y ago
@Angelo But if that's the case, why would the instance on my local machine, with the exact same data and processes, be consuming less than 300 MB? Project ID: 34108dc5-7795-469f-b853-2863bbedfa6f. Thx! 🙂
angelo
angelo•2y ago
Docker Daemon funniness; it's not great. Anyway, lemme look deeper into this.

@Railay - so I restarted your DB, which should help. But after looking at your logs, it's apparently trying to make connections and is thrashing. Is your code asking the DB to ingest something it can't find? In addition, this instance has quite a lot of records, so with the way we have Timescale tuned, it's going to optimistically cache that data.
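One way to confirm the thrashing from the database side (standard catalog views; a sketch, not Railway tooling):

    -- Count connections by state to spot a reconnect loop
    SELECT state, count(*) FROM pg_stat_activity GROUP BY state;

    -- Newest backends and what they're running
    SELECT pid, backend_start, state, query
    FROM pg_stat_activity
    ORDER BY backend_start DESC
    LIMIT 10;

A steady churn of very young backends usually points at a client reconnecting in a loop rather than at Postgres itself.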
Railay
Railay•2y ago
@Angelo thx for the feedback! I'll look into the connection issue... something in the connection settings must not be shutting down properly. Can I disable timescale with something like DROP EXTENSION timescaledb; ?
The table has a lot of records, but the query isn't fancy and the one query on a big table isn't time sensitive... the results are stored in an application state and updated at 1 minute intervals. So it doesn't matter to my users if the query takes 100ms or a few seconds to run.
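On the DROP EXTENSION question above: roughly, yes, assuming nothing depends on it (a sketch; DROP EXTENSION will refuse to run if hypertables or continuous aggregates still use it, and Timescale's cache tuning lives in the server config, so dropping the extension alone may not change memory behavior):

    -- Remove the extension from this database; add CASCADE only if you are
    -- certain nothing depends on it
    DROP EXTENSION IF EXISTS timescaledb;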
Railay
Railay•2y ago
@Angelo actually, your docs say timescale isn’t enabled by default. I didn’t turn it on (as far as I know). Do you see it running? Or are the docs out of date? https://docs.railway.app/databases/postgresql
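Whether it's actually installed is easy to check from psql (standard catalog query):

    -- timescaledb shows up here only if the extension is enabled in this database
    SELECT extname, extversion FROM pg_extension;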
Railay
Railay•2y ago
@Angelo any thoughts on this? If I can't get the cost down, then I need to move the DB away from Railway 😢

I'm also not able to pg_dump my database. I confirmed I'm using all the right connection settings, but I get the following error:
pg_dump: error: connection to database "railway" failed: could not connect to server: Connection refused
Is the server running on host "containers-us-west-XX.railway.app" (35.XXX.XXX.XX) and accepting
TCP/IP connections on port XXXX?
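For reference, the failing command was shaped roughly like this (host, port, and database name taken from the error above; the hostname and port stay redacted here just as in the error, and the -U user is an assumption):

    # pg_dump over TCP with explicit connection parameters
    pg_dump -h containers-us-west-XX.railway.app -p XXXX -U postgres -d railway -f railway.sql

"Connection refused" at this layer means nothing accepted the TCP connection at all, so it points at the server or proxy side rather than credentials.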
Pirhoo
Pirhoo•16mo ago
Hello, I'm also seeing abnormal memory usage on my PostgreSQL database. Any chance you can restart it for me please? Plugin ID is 415bbab4-5a16-4e9a-96e4-c414a378b854; project ID is 51efbe4c-27d0-456d-950c-1648dbea6e8f.
Pirhoo
Pirhoo•16mo ago
Metrics are through the roof (and got even higher after I tried to garbage-collect the database)
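Worth noting: "garbage-collecting" Postgres usually means VACUUM, and the FULL variant rewrites entire tables under an exclusive lock, which tends to spike memory and I/O before it frees anything; that would fit metrics climbing right after. A sketch of the gentler option:

    -- Plain VACUUM reclaims dead rows for reuse without rewriting the table
    VACUUM ANALYZE;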
Brody
Brody•16mo ago
open your own thread please
Pirhoo
Pirhoo•16mo ago
sure