Prisma · 6d ago
Elias

"upsertMany" with transaction timing out

I'm trying to build a database containing stock for different products from our suppliers. The stock can be fetched as a CSV file from an FTP server. What I want to do is periodically (every day) run a script that gets the CSV file, parses it, and inserts the values into a database. It should also save the previous history of the stock. This is the code I currently have:
```ts
await client
  .$transaction(
    async (tx) =>
      await Promise.all(
        records.map(([reference, quantity, _]) =>
          tx.part.upsert({
            where: { reference },
            update: {
              quantity: parseInt(quantity),
              history: {
                create: { date: file.date, quantity: parseInt(quantity) },
              },
            },
            create: {
              reference,
              quantity: parseInt(quantity),
              supplier: { connect: { id: info.id } },
              history: {
                create: { date: file.date, quantity: parseInt(quantity) },
              },
            },
          })
        )
      ),
    {
      timeout: 300000,
    }
  )
  .catch((err) => {
    console.error(err)
  })
```
The CSV contains almost 5000 products/entries, which causes the transaction to time out. Is there a better, more efficient way of doing this?
2 Replies
Ankur Datta · 6d ago
Hey there @Elias 👋🏾. A simple solution would be to break the request down into smaller chunks, for example handling 100 records at a time. On top of that, to further improve performance, you could use a library like Parallel.js to divide the processing across separate processes — for example, 5 processes (depending on how many cores you have), each handling 100 records. https://www.npmjs.com/package/paralleljs
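The chunking idea above can be sketched as follows. The `chunk` helper is hypothetical (not from the question), and the commented loop assumes the `client`, `records`, `file`, and `info` variables from the original post; it replaces the single ~5000-upsert interactive transaction with one short transaction per batch, which keeps each transaction well under the timeout.

```typescript
// Hypothetical helper: split an array into fixed-size batches.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// Sketch of the batched upsert loop (assumes the Prisma `client` and the
// `records` array from the question). Passing an array of operations to
// `$transaction` runs them as one sequential transaction, so each batch
// of 100 commits independently instead of holding one long transaction
// open for all ~5000 rows:
//
// for (const batch of chunk(records, 100)) {
//   await client.$transaction(
//     batch.map(([reference, quantity, _]) =>
//       client.part.upsert({
//         where: { reference },
//         update: { quantity: parseInt(quantity) /* ...history as above */ },
//         create: { reference, quantity: parseInt(quantity) /* ... */ },
//       })
//     )
//   );
// }
```

Note this trades atomicity for throughput: if a later batch fails, earlier batches have already committed, which is usually acceptable for a daily stock-sync job.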
Solution
Elias · 5d ago
I switched to postgresql and now everything works as intended and a lot quicker!