"upsertMany" with transaction timing out
I'm trying to build a database containing stock for different products from our suppliers. The stock can be fetched as a CSV file from an FTP server. What I want to do is periodically (every day) run a script that gets the CSV file, parses it, and inserts the values into a database. It should also keep a history of previous stock levels.
This is the code that I have currently.
The CSV contains almost 5,000 products/entries, which causes the transaction to time out.
Is there a better and more efficient way of doing this?
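For reference, the job is roughly shaped like the sketch below. This is a simplified stand-in rather than my actual code: the FTP details, the CSV column names (sku, stock), and the two helper functions are all placeholders.

```ts
import { Client } from "basic-ftp";      // example FTP client choice
import { parse } from "csv-parse/sync";  // example CSV parser (csv-parse v5 'sync' API)
import { readFileSync } from "fs";

type StockRow = { sku: string; stock: number };

// Placeholder: upsert one product's stock and append a row to a history table.
async function upsertStockWithHistory(row: StockRow): Promise<void> {
  // e.g. INSERT ... ON CONFLICT (sku) DO UPDATE, plus an insert into stock_history
}

// Placeholder: wrap the work in a single database transaction.
async function runInTransaction(work: () => Promise<void>): Promise<void> {
  await work(); // the real version issues BEGIN/COMMIT via the ORM/driver
}

async function importDailyStock(): Promise<void> {
  // 1. Download today's stock file from the supplier's FTP server (placeholder credentials/paths).
  const ftp = new Client();
  await ftp.access({ host: "ftp.example.com", user: "user", password: "pass" });
  await ftp.downloadTo("stock.csv", "exports/stock.csv");
  ftp.close();

  // 2. Parse the CSV into objects keyed by the header row.
  const raw = parse(readFileSync("stock.csv", "utf8"), { columns: true }) as
    { sku: string; stock: string }[];

  // 3. Upsert everything inside one big transaction -- with ~5000 rows
  //    this is the part that times out.
  await runInTransaction(async () => {
    for (const row of raw) {
      await upsertStockWithHistory({ sku: row.sku, stock: Number(row.stock) });
    }
  });
}
```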
Solution:
I switched to PostgreSQL and now everything works as intended, and a lot quicker!
2 Replies
Hey there @Elias 👋🏾 ,
A simple solution would be to break the upsert down into smaller batches. For example, handle 100 records per transaction.
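For example, something along these lines (a rough sketch; `runInTransaction` and `upsertStock` stand in for whatever your actual transaction and upsert calls are):

```ts
type StockRow = { sku: string; stock: number };

// Split an array into fixed-size batches.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) out.push(items.slice(i, i + size));
  return out;
}

// Run one short transaction per batch of 100 instead of a single giant one,
// so each transaction finishes well within the timeout and a failure only
// rolls back its own batch.
async function importInChunks(
  rows: StockRow[],
  runInTransaction: (work: () => Promise<void>) => Promise<void>,
  upsertStock: (row: StockRow) => Promise<void>,
): Promise<void> {
  for (const batch of chunk(rows, 100)) {
    await runInTransaction(async () => {
      for (const row of batch) await upsertStock(row);
    });
  }
}
```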
On top of that, to further improve performance, you could use a library like Parallel.js to divide the processing across separate worker processes. For example: have 5 processes (depending on how many cores you have), each handling a batch of 100 records. https://www.npmjs.com/package/paralleljs
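One caveat with Parallel.js: the function you pass to `map` is serialized and run in a separate worker process, so it can't close over your database client or anything else from the main process. A rough sketch (the worker count and the normalization logic are just examples) is to push the CPU-bound parsing/cleaning of the rows into workers and keep the actual database writes in the main process:

```ts
// Using require here since type declarations for paralleljs may not be installed.
const Parallel = require("paralleljs");

type RawRow = { sku: string; stock: string };
type StockRow = { sku: string; stock: number };

// Fan the raw CSV rows out to worker processes for normalization.
// The mapped callback must be fully self-contained: no closures over
// the DB client, config objects, or other outer state.
function normalizeRows(raw: RawRow[]): Promise<StockRow[]> {
  return new Promise((resolve) => {
    const p = new Parallel(raw, { maxWorkers: 5 }); // e.g. 5 workers; tune to your core count
    p.map((row: RawRow) => ({ sku: row.sku.trim(), stock: parseInt(row.stock, 10) }))
      .then(resolve);
  });
}
```

The batched upserts from the earlier sketch would then run in the main process on the normalized rows.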