How can I do a bulk data load into my MySQL database?
That's close to 1 million rows to insert into a single table in my database.
25 Replies
The same way you would for any other MySQL database.
The problem is that I'm using MySQL Workbench to connect to my database hosted on Railway, and I'm using the 'Table Data Import Wizard,' but it's taking too long. Is there a more efficient way to load the data into my table with better performance?
That's how you'd want to do it. Do you have slow internet?
I don't think my internet is bad, but it's not very fast either. I have a 50 Mbps connection
What's your upload speed?
12 Mbps
could take a while, just let it go and I'm sure it will finish
The problem is that I've made estimates and it will take 3 days to upload 252,000 records, and I have 1 million records, so this isn't an optimal path for me...
I've been researching and testing 'LOAD DATA INFILE,' and it has extraordinary performance, inserting 252,000 records into my table in less than a second. Is there a way to do this with a Railway database?
The problem is that I tested it locally, but when I try to do it with the Railway database, I get the following error: ErrorCode: 1290. The MySQL server is running with the --secure-file-priv option so it cannot execute this statement
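Error 1290 means the server-side form of the statement is blocked: plain LOAD DATA INFILE reads a file from the *server's* filesystem, which you can't reach on a managed host like Railway. The client-side variant, LOAD DATA LOCAL INFILE, streams the file from your own machine over the connection instead. A minimal sketch in Python, assuming mysql-connector-python is installed, that the server permits `local_infile`, and with placeholder connection details:

```python
# Sketch: client-side bulk load via LOAD DATA LOCAL INFILE.
# Assumptions: mysql-connector-python is installed, the server allows
# local_infile, and the host/user/password below are placeholders.

def build_load_statement(csv_path: str, table: str) -> str:
    """Build a LOAD DATA LOCAL INFILE statement for a comma-separated CSV."""
    return (
        f"LOAD DATA LOCAL INFILE '{csv_path}' "
        f"INTO TABLE {table} "
        "FIELDS TERMINATED BY ',' ENCLOSED BY '\"' "
        "LINES TERMINATED BY '\\n'"
    )

def load_csv(csv_path: str, table: str) -> None:
    """Stream a local CSV file to the remote server in a single statement."""
    import mysql.connector  # deferred: pip install mysql-connector-python
    conn = mysql.connector.connect(
        host="your-db-host.railway.app",  # placeholder connection details
        port=3306,
        user="root",
        password="...",
        database="railway",
        allow_local_infile=True,  # lets the client send a local file
    )
    try:
        cur = conn.cursor()
        cur.execute(build_load_statement(csv_path, table))
        conn.commit()
    finally:
        conn.close()
```

Note the LOCAL variant still depends on the server's `local_infile` setting being ON; if the host has it disabled, the statement fails with a similar error.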
the bottleneck here is your internet speed
Would the performance improve with a better internet connection, so there wouldn't be a need to use 'LOAD DATA INFILE'?
correct, the internet speed is the issue
Perfect, I'll try it with a faster internet connection and get back with updates. Thank you very much.
no problem!
Try uploading the file into the same environment as your database on Railway, then use the CLI to run the import command from a terminal there; that may resolve your problem.
@Garcia How can I upload my CSV file to where my database is?
Instead of keeping your machine running all day to import the data, you can run a background worker to import it. That would be faster and make things easier for you.
@Brody Hi, I have some updates. I tried with a 1Gbps internet connection, and it did improve the performance, but it still takes around 30 hours (1 day and 6 hours) to load just 252,000 records, and I can't wait that long since I have 4 files...
@Yashu Sorry for my ignorance, but I don't know how to do that. Would it be too much trouble to provide a document or something similar to help me create it? I'm not very familiar with the term.
What is the actual size of the SQL file?
Definition: a background worker is an independent process or thread in an application that runs in the background. Generally, workers run periodically to perform some task.
You can store/upload your SQL/CSV file to Cloudflare R2 or AWS S3, then write a program in whichever language you're comfortable with (JS, Ruby, Python, Go).
The program can read the file from the R2/S3 bucket (you can further split the files and run multiple processes) and then insert the data into the DB.
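The worker described above could look roughly like this: a sketch only, assuming boto3 and mysql-connector-python are available, with placeholder bucket, credential, and column names. The key point is batching: `cursor.executemany` sends many rows per network round trip instead of one.

```python
# Sketch of a bulk-import worker: fetch a CSV from an S3/R2 bucket and
# insert it into MySQL in 1,000-row batches. Bucket, host, credentials,
# and column names below are all placeholders, not values from the thread.
import csv
import io
from itertools import islice
from typing import Iterable, Iterator

def batched(rows: Iterable[tuple], size: int) -> Iterator[list]:
    """Yield items from `rows` in lists of at most `size`."""
    it = iter(rows)
    while batch := list(islice(it, size)):
        yield batch

def run_worker(bucket: str, key: str, table: str) -> None:
    """Read a CSV from object storage and insert it in batches."""
    import boto3                 # deferred: only needed when the worker runs
    import mysql.connector       # pip install mysql-connector-python
    body = boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"]
    reader = csv.reader(io.StringIO(body.read().decode("utf-8")))
    conn = mysql.connector.connect(
        host="your-db-host.railway.app",  # placeholder connection details
        user="root", password="...", database="railway",
    )
    cur = conn.cursor()
    sql = f"INSERT INTO {table} (col1, col2) VALUES (%s, %s)"  # placeholder columns
    for batch in batched((tuple(row) for row in reader), 1000):
        cur.executemany(sql, batch)  # one round trip per batch, not per row
        conn.commit()
    conn.close()
```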
I have 4 CSV files, each weighing approximately 3MB
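A back-of-envelope check (my own numbers, not from the thread) shows why raw bandwidth can't explain the slowness for files this small; the time is going into one network round trip per inserted row:

```python
# A 3 MB file over a 12 Mbps upload link takes about two seconds to send,
# so the raw transfer is negligible even on the slower connection.
file_mbit = 3 * 8                 # 3 MB expressed in megabits
upload_mbps = 12
transfer_seconds = file_mbit / upload_mbps   # 2.0 seconds

# Row-by-row inserts pay one network round trip per row. Assuming an
# illustrative 100 ms round trip, 252,000 rows take about seven hours.
rows = 252_000
rtt_ms = 100
hours_row_by_row = rows * rtt_ms / 1000 / 3600  # 7.0 hours

print(transfer_seconds, hours_row_by_row)  # 2.0 7.0
```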
Why would that take over a day? Something is being done wrong.
How are you importing them into the database?
Literally, I'm doing it with the 'Table Data Import Wizard': selecting the CSV file, matching the columns of my file to the columns of my table, and executing it.
Please look for alternative ways to import CSV files.