Unable to connect Prisma to RDS database
I am deploying a beta version of my webapp to prod, however I am unable to connect to my RDS DB. I have tried multiple different engines (Aurora MySQL, MySQL 8.0.35, MariaDB 10.11.6).
I keep getting the error "PrismaClientInitializationError: Authentication failed against database server at db-host, the provided database credentials for db-user are not valid"
The credentials are definitely correct and I have tested connecting to the database without Prisma and it works perfectly fine.
Any guidance would be greatly appreciated as I have been troubleshooting this issue for 3 days with no luck.
Thanks in advance
Just to add to that, I was at one point getting an error about the authentication plugin, however I haven't been able to recreate it.
@RaphaelEtim Hey dude, are you able to help on this matter? Apologies if I shouldn't have @'d you, I'm not sure what the etiquette is.
Do you have special characters in any part of your connection URL? If yes, you would need to percent-encode them, as shown in this part of the documentation
Connection URLs (Reference) | Prisma Documentation
Learn about the format and syntax Prisma ORM uses for defining database connection URLs for PostgreSQL, MySQL and SQLite.
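For anyone landing here: percent-encoding means characters like @ : / # in the password get replaced with their %XX codes. A quick way to encode a value (the password below is made up; any stdlib URL-encoder works):

```shell
# encode a hypothetical password with special characters for use in a connection URL
python3 -c 'from urllib.parse import quote; print(quote("p@ss:w/rd", safe=""))'
# prints p%40ss%3Aw%2Frd
```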
Randomly generated passwords with special characters is what got me for a bit, went straight alphanumeric and it solved that problem for me.
I do not have any special characters in the password, as I read something about needing to escape special chars, so for testing purposes I have been using 1175ZE90ss3Kv7weCXsWxOJphj7AJ0REb7
I will try using a non-randomly-generated one and see if that helps
Nah, yours should be fine, hrm
I'm hesitant to ask for your connection string since that'd expose the entire db, but one other problem I had was a copy/paste error that messed with the separator between the user/password
The easiest way to debug that imo is to copy/paste the string into the postgres, or whatever, command line tool
Hmm yeah, I had a feeling that could be an issue since I've been copying from a password manager
I've tried a hand-typed password and connection string and neither worked, unfortunately
I'm gonna be switching jobs, aka rigs, here shortly, I'll come back to this and see what I can think of
I don't have a prisma project for the one I'm working on currently
Much appreciated
postgresql://DB_USER:DB_PASS@DB_HOST:DB_PORT/DB_NAME
my conn string has
?schema=public
at the end, but psql won't take it, copy everything of yours and issue
psql "WHOLE_CONN_STRING"
and see if you can connect
Here are some of the variations I've tried, I have shortened the hostname so as not to expose the db
I'm unsure how to replicate this with mysql
the CLIs for mysql and psql are very close, have you tried:
mysql "mysql://user:[email protected]:3306/v4beta"
unfortunately I don't have mysql locally, or any remotes, to test this
it should work actually
https://dev.mysql.com/doc/refman/8.4/en/connecting-using-uri-or-key-value-pairs.html#connecting-using-uri
Can you please enable debugging to get more detailed logs.
https://www.prisma.io/docs/orm/prisma-client/debugging-and-troubleshooting/debugging#setting-the-debug-environment-variable
Debugging (Reference) | Prisma Documentation
This page explains how to enable debugging output for Prisma Client by setting the DEBUG environment variable.
Yeah that's what I did but it said it couldn't connect to the local mysql socket
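Worth noting, that "local mysql socket" error is expected there: the classic mysql client doesn't parse connection URIs (that's mysqlsh / MySQL Shell), so it treats the whole string as a database name and falls back to the default local socket. With plain mysql you'd spell the parts out, host shortened like above:

```shell
# classic mysql client takes flags, not a URI (host here is a placeholder)
mysql -h db.rds.amazonaws.com -P 3306 -u user -p v4beta
```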
Okay thanks ill try this today and put an update in here
I added DEBUG="prisma*" to my environment variables and it didn't change the output of the log
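For reference, a minimal sketch of enabling it from the shell, assuming a Node entrypoint called server.js (name made up); the variable has to be present in the environment of the Node process itself:

```shell
# hypothetical entrypoint; DEBUG must reach the process running Prisma Client
DEBUG="prisma*" node server.js
```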
Can you please share the log here?
I've had a call with AWS support and they have confirmed it's not a problem on their end, as I am able to connect to the database using HeidiSQL
From the research I've been doing it seems like this is a common problem with deploying Prisma to AWS, however none of the fixes have worked for me
I'm going to try setting up an RDS Proxy and see if that helps at all, and will update with my findings
Since you can connect with something like dbeaver locally, or the mysql/psql command line after ssh'ing into your server rig, and the credentials are gtg, I'm at a loss here
I had suspicions that it could be to do with root certificates, but when I looked into it there's no way Heidi would have access to the root cert, and that works fine
For this deployment RDS is not crucial, so if need be I can just set up and host a mysql server. But I was planning to upgrade to high availability in a few months once I start onboarding customers, so using RDS out of the gate would make that planned upgrade a hell of a lot easier
There really shouldn't be a valid reason why you can't connect from say an ec2 to rds though.
I can connect to RDS from EC2, just not using Prisma apparently
All my testing with HeidiSQL has been using the EC2 instance as an SSH tunnel, as the db isn't exposed to the internet
I mean, as much as I hate what I'm about to write, technically it is connecting... just having the connection request kicked back
Well yeah, you're not wrong, it is finding the database server
Yeah, but that's fundamentally using a vpn
I'm gonna go try to find the log from the db side to see if it has anything useful
Well that wasn't very insightful
rarely is sadly
Tried updating to the latest Prisma version but no luck, gonna try RDS Proxy now, though I can't see that making a difference
Ah, apparently I can't use RDS Proxy unless I'm in multiple AZs
Wish I was using AWS for this project I'm on, we're using DigitalOcean currently
My other gig uses AWS, but I can't randomly spin up some rigs to test right now
It's all good man, I know how it is. I appreciate all the help thus far, it has helped me rule out a bunch of stuff
I'm going to try hardcoding the endpoint into the Prisma schema to rule out Portainer/Docker
Hardcoding it worked, so this is an issue with how Portainer or Docker handles environment variables
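For anyone hitting the same wall: the usual suspects with compose env handling are quoting and interpolation. A sketch of the two ways to pass a value (service name and file layout are assumptions, not from this thread); depending on your compose version, quotes inside an env_file may be kept as part of the value, and a $ in a value in the compose file itself gets interpolated unless escaped as $$:

```yaml
# docker-compose.yaml sketch; service name is made up
services:
  app:
    image: node:20
    env_file:
      - .env                # DATABASE_URL="mysql://..." may keep the quotes literally
    environment:
      - DEBUG=prisma*       # inline entries override env_file on conflicts
```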
Should I mark this as solved and update the title, in case anyone has the same issue?
I mean, if you want some help troubleshooting your dockerfile I can probably help
I don't use portainer for anything though
That would be great, as the only reason I was using Portainer for the env was I was having issues when trying to load one from a file
What CI/CD platform are you using?
github, bitbucket? something like that
Erm, I'm not yet, very small company and solo dev, haven't had a chance to set one up
The aim is to use ECS in the long run but for now I'm just uploading via SFTP
Well it doesn't really matter as this is probably an issue building your container anyway
So I store my env var key/value pairs in bitbucket to be injected into the container
I'm not actually building an image, I'm using docker-compose with volumes
Although I will have to build an image for the final deployment, while it's in beta I was going to just use volumes so I can do hot fixes etc
well I have a compose file with env vars, I'll drop a snippet
These things shouldn't be committed anyway imo, at least I don't for that thing
I'm also trying to stay in the AWS ecosystem as much as possible to keep costs down, because I managed to get AWS Activate for Startups
that's a docker-compose.yaml file
you can also get a similar program from GCP fyi, I'm migrating over for this other job
Yeah I'll give that a go, the only reason I wasn't was so it could be committed, but till I have more devs there's no real need
bitbucket has a more than good enough free tier imo to do this "correctly" now
I'm not sure how I feel about GCP after the whole deleting-all-of-a-company's-data thing
3 users, usually more than enough build minutes
Oh cool ill check it out
well I watched my last company burn to the ground when us-east-1 screwed off one day on a Friday, they all have problems
Ah thats fair yeah
if you maintain db backups, off site ones, it is what it is
bitbucket is like github, I just don't use github for much right now so can't be super helpful with their ci/cd platform
Ideally I'd love to do on-premises but we're stuck behind CGNAT
bitbucket can build your containers, shove it off to ECR, then you use elastic beanstalk to deploy the container
on premise 🤮
For backups
ahh, just need a decent local cron task
take a backup, occasionally stand it up to validate it works, shove it into google drive or something
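A sketch of that kind of cron job, assuming mysqldump is available; host, user, password variable and paths are all placeholders, the db name is the one from earlier in the thread:

```shell
# nightly logical backup; everything capitalized is a placeholder
mysqldump --single-transaction -h DB_HOST -u DB_USER -p"$DB_PASS" v4beta \
  | gzip > /backups/v4beta-$(date +%F).sql.gz
```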
while maintaining some images on rds
rds images are annoying imo as I never found a way to use them off rds
Yeah that should be simple enough, and I can finally put the gdrive space to good use 😂
Yeah that sounds about right
Dockerfile
What's the difference between args and envs?
DATABASE_URL, etc, you'll store in bitbucket
Oh i get you
oh hell if I know, I just know that I had problems not using them both
so now I use them both lol
So i think you pass an arg to the image and the arg goes into the env perhaps
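That's the gist: ARG only exists while the image builds, and you copy it into ENV if the running container needs it too. A minimal Dockerfile sketch (only the DATABASE_URL name comes from this thread):

```dockerfile
# build-time value, supplied via: docker build --build-arg DATABASE_URL=...
ARG DATABASE_URL
# re-export it so the running container (and Prisma) can see it
ENV DATABASE_URL=${DATABASE_URL}
```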
in bitbucket, you'll have a bitbucket-pipelines.yml file, you'll issue your docker build command there
docker build --build-arg DATABASE_URL="${DATABASE_URL}"
... one for every arg, and make damn sure you have a space for multi-line continuations
My jr that set that particular api up used two dockerfiles
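A sketch of what that step might look like in bitbucket-pipelines.yml (image name and step layout are assumptions); the indented continuation lines are where the multi-line spacing gotcha bites, since YAML folds them together with a single space:

```yaml
# bitbucket-pipelines.yml sketch; names and layout are assumptions
pipelines:
  default:
    - step:
        services:
          - docker
        script:
          - docker build
            --build-arg DATABASE_URL="${DATABASE_URL}"
            -t myapp .
```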
anyway, when you get there hit me up and we can do/schedule a google hangout or something
Much appreciated, I'll drop you a message in the next couple of days once I've had a chance to get things ready
un-fun pro-tip: make sure you have "delete eol white space on save/formatting/etc" turned OFF, for at least yaml files, or you will lose days of your life
Thanks for the heads up