Mongo instance started returning "pool destroyed"
hello, i've had this instance running for a while now and noticed it's down. i saw a spike in traffic so i'm guessing it's due to that. i tried to restart but it gave me an error. any guidance here? i have users trying to access it.
thanks!
Project ID:
N/A
i can provide account info, but i opted not to since i had to connect my discord account to my railway account, so i'm assuming that info is readily available to you
projectId
3b45eaa5-6a58-4802-82d4-42068fc9c5d7
what's more concerning is i wasn't even alerted (maybe i have it off?)
so your app crashed and wasn't restarted?
i don't even know?
all i know is i see "pool destroyed" on any query. and when i try to restart it gives me a generic error
i can try again
can you connect to the mongo database with something like dbgate?
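(if dbgate isn't handy, a quick standalone check with the official node driver does the same job. just a sketch, and the URL is a placeholder for whatever connection string the service gives you:)
```ts
// quick connectivity sanity check against the Mongo service (sketch; the URL is a placeholder)
import { MongoClient } from 'mongodb';

async function main() {
  const client = new MongoClient(process.env.MONGO_URL ?? 'mongodb://user:pass@host:port');
  await client.connect();
  // ping confirms the server is reachable and accepting commands
  console.log(await client.db().command({ ping: 1 }));
  // list collections to see whether the data is still there
  console.log(await client.db().listCollections().toArray());
  await client.close();
}

main().catch(console.error);
```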
okay will bring this up to the team
Flagging this thread. A team member will be with you shortly.
is there a timeframe on when i should hear back about this?
now
o/
Interesting
i don't like the sound of that haha
Well, I can connect to it
hmmm
Is it supposed to have data in it?
yeah
when i try to query it i just get that error. a user reported it and that's when i was like o.O
it's been running smoothly for like a year, i figured i'd be able to restart it and call it a day
Does your / route check health of the DB?
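(if it doesn't, here's a minimal sketch of one, assuming an Express app and the official mongodb driver; the route and variable names are just illustrative:)
```ts
// minimal DB health check route (sketch; assumes MONGO_URL is set on the service)
import express from 'express';
import { MongoClient } from 'mongodb';

const app = express();
const client = new MongoClient(process.env.MONGO_URL!);

app.get('/health', async (_req, res) => {
  try {
    // ping fails fast if the client's connection pool has been torn down
    await client.db().command({ ping: 1 });
    res.status(200).json({ db: 'ok' });
  } catch (err) {
    res.status(503).json({ db: 'unreachable', error: (err as Error).message });
  }
});

client.connect().then(() => app.listen(3000));
```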
Cause, from what I can see, I can connect to that DB using both the UI and mongosh
https://narutoql.up.railway.app
nah
let me verify though
oh yeah it def connects to db
idk what pool destroyed means
or why i can't restart the instance. so two weird thingies
Mind if I give it a go?
yeah!
(Restarting the Mongo)
Odd. Was able to just fine. I'll have to see if there's a permissions error or something for the API
Now, exactly wtf a pool destroyed error is (could be related to connection closing?) I'm not sure, but that's defs an application level thing
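(for what it's worth, you can reproduce something like it with the node driver by querying after the client has been closed. a sketch only, and the exact error text varies by driver version, so this may not be exactly what the app hit:)
```ts
// querying after the client is closed (sketch; reproduces a "pool/topology destroyed" style error)
import { MongoClient } from 'mongodb';

async function main() {
  const client = new MongoClient(process.env.MONGO_URL ?? 'mongodb://localhost:27017');
  await client.connect();
  await client.close(); // something closing the shared client is the usual culprit

  try {
    // 3.x-era drivers surface this as "pool destroyed" / "Topology was destroyed";
    // newer drivers report it differently
    await client.db().collection('test').findOne({});
  } catch (err) {
    console.error((err as Error).message);
  }
}

main().catch(console.error);
```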
Added a ticket to look into the permissions stuff tmo
interesting. it's been running stable for a long time. you actually see data on the instance?
sorry, not at a computer. i'll take a look tonight
Would it be under the collection
test
or something else?
yeah probably test
I'm not seeing anything in the UI nor when I dial it?
well pooper scooper
Does it have an endpoint to delete the data or something? You said it's supposed to have data there :/
Only time we've seen data loss has sadly been if someone nukes this stuff
yeah it's just read-only data
rip
something else is up besides that though, since there's a pool destroyed message something is closing it on my end. all good
Uhh, that's super odd. I haven't seen anything like this before
before this happened a user told me they were executing a lot of queries and then it happened, so maybe the load or something nuked it
all good, i'll just tear it down and stand another one up
Well I mean, it's more that I'm confused how this would happen if it's read only...
I've thrown some stupid fkn load at these plugins and they've been totally fine
yeah i wish i knew tbh
Well that's odd. Like, just to clarify, there's like no endpoint in there that would just delete the collection?
Oh L, did you just nuke the instance?
(Totally fine but I was about to ask if I could export logs to see if there was a delete in there)
fk i totally did -_-
yeah all i have is queries: https://github.com/bautistaaa/narutoQL/tree/main/src/server/graphql
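(they're all read-only finds along these lines. a rough sketch, and the collection/field names here are just illustrative, not copied from the repo:)
```ts
// illustrative read-only resolver shape (sketch; names are made up, nothing writes or deletes)
import { Db } from 'mongodb';

interface Context {
  db: Db;
}

export const resolvers = {
  Query: {
    characters: async (_parent: unknown, args: { page?: number }, { db }: Context) => {
      const page = args.page ?? 1;
      // paginated read of a collection; purely find/skip/limit
      return db
        .collection('characters')
        .find({})
        .skip((page - 1) * 25)
        .limit(25)
        .toArray();
    },
  },
};
```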
forgive me, i don't ever touch this thing
nahnah all g
dumb q, it's been a year since i touched this stuff. i spun up a new mongo and seeded it
do i need to refresh the node app with the new env vars somehow or is that automagic
i am redeploying now, looks like the env vars are diff so that seems legit
as long as you have used variable references, you'd just need to trigger a redeploy, either by pushing to git or through the 3 dot menu
hmm something is borked, all g i'll figure it out
i'll leave this here 🙂
https://docs.railway.app/develop/variables#reference-variables
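for example, a reference in the node service's variables would look something like this (service and variable names here are just examples, yours may differ):
```
MONGO_URL=${{MongoDB.MONGO_URL}}
```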
back in business
🙂
crappy part before is i couldn't see any logs cause i deployed it like beginning of 2022, so maybe i'll have more insight next time around!