[SOLVED] When running a big bulk insert I get an error, MAX_PARAMETERS_EXCEEDED

I want to run a transaction that runs a large bulk insert but I get this error:
Error: MAX_PARAMETERS_EXCEEDED: Max number of parameters (65534) exceeded
at toBuffer (/home/mast/workspace/work/mix-opt/dev/node_modules/.pnpm/postgres@3.3.5/node_modules/postgres/cjs/src/connection.js:181:20)
at Object.execute (/home/mast/workspace/work/mix-opt/dev/node_modules/.pnpm/postgres@3.3.5/node_modules/postgres/cjs/src/connection.js:167:20)
at Query2.handler2 (/home/mast/workspace/work/mix-opt/dev/node_modules/.pnpm/postgres@3.3.5/node_modules/postgres/cjs/src/index.js:256:15)
at Query2.handle (/home/mast/workspace/work/mix-opt/dev/node_modules/.pnpm/postgres@3.3.5/node_modules/postgres/cjs/src/query.js:141:65)
12 Replies
MAST
MAST11mo ago
And by large I mean about 50,000 items, each with around 10 parameters. I also tried splitting them into arrays of 1,000 and inserting those, but after a few inserts I ended up getting the same error again.
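The 65534 figure is the driver-side cap on bind parameters per statement (the PostgreSQL extended-query protocol encodes the parameter count in a 16-bit field). A quick back-of-the-envelope check, assuming roughly 10 columns per row as described above, shows why a single 50,000-row insert can't fit:

```typescript
// Rough upper bound on rows per single INSERT, given the driver's parameter cap.
const MAX_PARAMS = 65534;        // postgres.js limit from the error message
const paramsPerRow = 10;         // assumption taken from this thread
const maxRowsPerInsert = Math.floor(MAX_PARAMS / paramsPerRow);
console.log(maxRowsPerInsert);   // 6553 — far below 50,000
```

So the insert has to be split into batches no larger than that bound.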
Angelelz
Angelelz11mo ago
Can you do it in a for loop? No driver is going to like you doing db.insert(users).values([array with 50000 items]), but you could do:
db.transaction(async tx => {
  for (let i = 0; i < array.length; ++i) {
    await tx.insert(users).values(array[i]);
  }
})
MAST
MAST11mo ago
Yeah, I ended up doing this. But can't I do it in chunks of 100? I feel like it should be doable 😄
Angelelz
Angelelz11mo ago
Sure you can, you could do:
db.transaction(async tx => {
  for (let i = 0; i < array.length; i += 100) {
    await tx.insert(users).values(array.slice(i, i + 100));
  }
});
But you've got to be careful with overflowing. Maybe you could change it to array.slice(i, Math.min(i + 100, array.length)).
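A small helper makes the chunking reusable. Note that Array.prototype.slice already clamps its end index to the array length, so the final short batch is safe even without Math.min. A minimal sketch (the chunk helper and batch size are illustrative, not part of the drizzle API):

```typescript
// Split an array into batches of at most `size` elements.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    // slice clamps the end index, so the last batch may simply be shorter
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Usage inside the transaction from above:
// await db.transaction(async tx => {
//   for (const batch of chunk(array, 100)) {
//     await tx.insert(users).values(batch);
//   }
// });
```

With 10 parameters per row, a batch of 100 uses about 1,000 parameters per statement, comfortably under the 65534 cap.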
MAST
MAST11mo ago
Yep, I'll give this a try. Yay, it works with batches of 100 😁. Thanks!
raghu1819
raghu181911mo ago
How do I do a bulk update in drizzle orm?
MAST
MAST11mo ago
The same way you do a single insert, but instead of passing a single object you pass an array of objects.
raghu1819
raghu181911mo ago
I passed it like that, but when I give the target a column other than the primary key, I get errors like "target value must be a primary key".
MAST
MAST11mo ago
Can you send the error and how you're inserting the data? Preferably in another help thread.
kbemaster
kbemaster6mo ago
Is it a good idea to use Promise.all here?
Angelelz
Angelelz6mo ago
It depends on how you want to send your queries. With Promise.all they will all be sent at the same time; with await in a for loop, they will be sent one at a time.
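The difference can be sketched with a stand-in async function (sendBatch below is hypothetical, not a real drizzle call). Sequential awaiting keeps one query in flight at a time; Promise.all dispatches them all immediately — though inside a single transaction they still share one connection, so the server processes them one after another regardless:

```typescript
// Hypothetical stand-in for tx.insert(users).values(batch) — not a drizzle API.
async function sendBatch(batch: number[]): Promise<number> {
  return batch.length; // pretend the driver returns a row count
}

async function main() {
  const batches = [[1, 2], [3, 4], [5]];

  // Sequential: one batch at a time, in order.
  const seq: number[] = [];
  for (const b of batches) {
    seq.push(await sendBatch(b));
  }

  // Concurrent: all batches dispatched at once, results gathered together.
  const par = await Promise.all(batches.map(sendBatch));

  console.log(seq, par); // both arrays: [2, 2, 1]
}

main();
```

One practical caveat: if any concurrent query fails, Promise.all rejects while the others are still in flight, which can make transaction error handling harder to reason about than the sequential loop.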
kbemaster
kbemaster6mo ago
Thx!