My Postgres is using too much memory for days now. Please help
Postgres has not been releasing memory for days now. I'm not sure what is happening. Please help
Project ID: 6d0f799e-be59-4388-899d-f00456f30667
ENV: Production
29 Replies
@angelo - Postgres at 8 GB of memory usage
in the meantime, back up the database and restore it to a neon.tech database
Hi @Brody, I did not understand. I created an S3 backup service.
This month's bill came in very high. It was stupid of me not to check the memory usage of the DB. I had an issue with Redis earlier and believed it was causing this.
Never thought Postgres would be such a memory hog
Adding to my queue today
Flagging this thread. A team member will be with you shortly.
Thank you. I'm waiting for a reply
you may still want to look into moving your database to neon.tech, as there's nothing stopping this from happening again in the future
email?
I will perform a partial refund in the meantime
But it's likely DB memory creep
[email protected] @angelo
@Angelo Hi?
have you done what I suggested?
No @Brody, I'm new to the neon.tech DB. I would like to read about it before I do. Also, I'm not sure how to migrate the data properly.
My Railway project stopped auto-deploying because of non-payment. Please perform the partial refund and, if possible, also help me with a temporary solution so this memory is freed. It's still running at 7-8 GB
cc: @Angelo
I want to try restarting the DB, but I do not know what I'm doing
"Restarting a database will result in a short amount of downtime. Only do this if you know what you are doing or are facing issues with your database."
@Brody did you mean I should sign up for neon.tech and use their hosted db?
yes
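(For reference, a minimal sketch of what that backup-and-restore migration could look like, assuming the pg_dump/pg_restore client tools are installed locally; the connection strings below are placeholders, not the project's real URLs:)
```python
# Sketch: dump the Railway Postgres database and restore it into a neon.tech database.
import subprocess

RAILWAY_URL = "postgresql://user:pass@railway-host:5432/railway"  # placeholder
NEON_URL = "postgresql://user:pass@neon-host:5432/neondb"         # placeholder

# Dump the source database in Postgres custom format.
subprocess.run(
    ["pg_dump", "--format=custom", "--file=railway.dump", RAILWAY_URL],
    check=True,
)

# Restore into the target database, skipping ownership so it maps onto
# the target's own roles.
subprocess.run(
    ["pg_restore", "--no-owner", "--dbname=" + NEON_URL, "railway.dump"],
    check=True,
)
```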
@Angelo any update?
a restart does not delete data
go ahead and restart
wait- for some reason Plain didn't send the message....
I credited your account $20; however, it seems that the memory usage is legit, you have a lot of data in there
@KiBender - here: Hey there, so I really dug into this one and gave your project the time it deserved. After looking through your DB, I would say that the memory usage matches the expected behavior of what we see on the platform. Although the climb in cost is regrettable, Postgres will cache data from operations for as long as there is memory available for it. I have credited your account $20 to make up for the overage. With that said, the upcoming cost controls coming next quarter will be the root fix for us, since you will be able to define spend ahead of time. Sorry it's not the answer you were expecting, but I hope the credits help. - Angelo
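(If you want to sanity-check that yourself, here is a rough sketch assuming psycopg2 is installed and a hypothetical DATABASE_URL environment variable points at the database:)
```python
# Sketch: inspect the memory-related settings and on-disk size of the database.
import os
import psycopg2

conn = psycopg2.connect(os.environ["DATABASE_URL"])  # hypothetical env var
with conn, conn.cursor() as cur:
    # shared_buffers is the fixed-size cache Postgres allocates up front.
    cur.execute("SHOW shared_buffers;")
    print("shared_buffers:", cur.fetchone()[0])

    # work_mem is allocated per sort/hash operation on top of shared_buffers.
    cur.execute("SHOW work_mem;")
    print("work_mem:", cur.fetchone()[0])

    # Total on-disk size of the current database.
    cur.execute("SELECT pg_size_pretty(pg_database_size(current_database()));")
    print("database size:", cur.fetchone()[0])
conn.close()
```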
@Angelo Thank you, I was really confused here. I agree there is some data there. Let me see what I can do with that data. Thank you so much. 🙂
I thought it was a Railway issue because of the way a shared server works
No, it's just Postgres being scummy
Do you have any suggestions for where I can store data that I won't update, just read and write? I was looking at ClickHouse, but I believe that is for time-series data?
@Angelo is this possible with Railway? To change shared_buffers
https://www.postgresql.org/docs/9.1/runtime-config-resource.html
PostgreSQL Documentation: Resource Consumption
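(For illustration only: on a self-managed Postgres instance where you have superuser access and can restart the server, lowering shared_buffers would look roughly like the sketch below. It assumes psycopg2 and a hypothetical DATABASE_URL; as the thread notes, a managed database may not expose this at all.)
```python
# Sketch: lower shared_buffers on a Postgres instance you fully control.
import os
import psycopg2

conn = psycopg2.connect(os.environ["DATABASE_URL"])  # hypothetical env var
conn.autocommit = True  # ALTER SYSTEM cannot run inside a transaction block
with conn.cursor() as cur:
    cur.execute("ALTER SYSTEM SET shared_buffers = '4GB';")
conn.close()
# shared_buffers only takes effect after a full server restart;
# a plain pg_reload_conf() is not enough for this setting.
```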
I'm not able to update it. I feel this will fix my issue @Angelo
Easy on the tags
and I am not a DB expert so I don't think I can configure this for you
If you look at the screenshot above, you can see the memory usage is exactly the same as what is set in shared_buffers
Can you help me reduce it to 4 GB at least?
Sorry, I don't want to set up an entirely new database just because of a Postgres config.
Yep - however, in this case, I'd assume that with the data you have and your usage, this is an okay, if not optimal, amount of memory usage.
Understood, but can you tell me if it's possible to change shared_buffers?
not from your end, no
Will you be able to do it for me?
sorry, that's likely not something the team would do
once again, if you aren't happy with Railway's databases, you can always try out neon.tech
you are not locked in to only using Railway's databases