Migration to seed a huge amount of data
Hi, are there any guides or suggestions on how to effectively seed a huge amount of data with migrations? We have the data in separate JSON (or CSV) files, which are copied to the "assets" folder inside the dist folder. How do you do this?
Hey 👋
This is very different between different databases. What dialect are you using?
postgresql
The naive solution off the top of my head (rough sketch below):
1. Use a CSV.
2. Use any of the CSV parsing + streaming Node.js solutions out there.
3. Chunk many records together and send them over using a classic insert query, OR a CTE: `with data as (values <tuples_from_data>) insert into ... select * from data`.
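A minimal sketch of those steps, assuming the `pg` and `csv-parse` packages; the `users` table, its columns, and the `dist/assets/seed.csv` path are placeholders for your own schema and files:

```ts
import fs from "node:fs";
import { parse } from "csv-parse";
import { Client } from "pg";

const CHUNK_SIZE = 1000;

async function seed() {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();

  // stream the CSV; `columns: true` yields objects keyed by the header row
  const parser = fs
    .createReadStream("dist/assets/seed.csv")
    .pipe(parse({ columns: true }));

  let chunk: { name: string; email: string }[] = [];

  const flush = async () => {
    if (chunk.length === 0) return;
    // build one multi-row insert: ($1,$2),($3,$4),...
    const values: string[] = [];
    const params: string[] = [];
    chunk.forEach((row, i) => {
      values.push(`($${i * 2 + 1}, $${i * 2 + 2})`);
      params.push(row.name, row.email);
    });
    await client.query(
      `insert into users (name, email) values ${values.join(",")}`,
      params
    );
    chunk = [];
  };

  for await (const row of parser) {
    chunk.push(row);
    if (chunk.length >= CHUNK_SIZE) await flush();
  }
  await flush(); // remaining rows
  await client.end();
}

seed().catch((err) => {
  console.error(err);
  process.exit(1);
});
```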
Also, there's this:
https://www.postgresql.org/docs/current/populate.html#POPULATE-COPY-FROM
https://www.postgresql.org/docs/current/sql-copy.html
should be faster.
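A sketch of the `COPY` route, assuming the `pg` and `pg-copy-streams` packages (any client that can feed `COPY ... FROM STDIN` would work); the table, columns, and file path are placeholders as before:

```ts
import fs from "node:fs";
import { pipeline } from "node:stream/promises";
import { Client } from "pg";
import { from as copyFrom } from "pg-copy-streams";

async function seedWithCopy() {
  const client = new Client({ connectionString: process.env.DATABASE_URL });
  await client.connect();

  // COPY streams the raw CSV straight into the table, skipping per-row
  // INSERT overhead; HEADER tells Postgres to skip the first line.
  const copyStream = client.query(
    copyFrom("COPY users (name, email) FROM STDIN WITH (FORMAT csv, HEADER true)")
  );

  await pipeline(fs.createReadStream("dist/assets/seed.csv"), copyStream);
  await client.end();
}

seedWithCopy().catch((err) => {
  console.error(err);
  process.exit(1);
});
```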