Is there a limit on the createManyCompanies endpoint?

When doing a bulk upload of companies via the UI, the operation fails if there are more than 2000 rows. Does that limitation also apply to the Core Rest API?
10 Replies
ɃØĦɆᵾS
ɃØĦɆᵾS2w ago
Bulk upload is limited by the size of your request in bytes/characters, but you can send several requests to bulk-create new companies
ɃØĦɆᵾS
ɃØĦɆᵾS2w ago
About the 2000-row limit in the UI, there's a discussion about it here: https://github.com/twentyhq/twenty/discussions/6647
basicbass
basicbass2w ago
@ɃØĦɆᵾS you're my hero! For what it's worth, I have just shy of 100,000 companies to upload, hence API > UI for this one!
ɃØĦɆᵾS
ɃØĦɆᵾS2w ago
That's a lot of companies to upload, ngl. Just be aware of 2 things:
- define some kind of timeout between each request so you won't kill the server with too many requests at the same time (accidentally creating a DDoS is the last thing you probably want) [I did it rn by mistake :P]
- the size of the request body also matters; while I'm not sure about the size limit, I'd be cautious about it if I were you, especially with the amount you want to upload
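The second caution above (an unknown body-size limit) can be sketched as a guard that splits any batch whose serialized JSON is too large. The `MAX_BODY_BYTES` cap here is a hypothetical, conservative value, not a documented limit of the API:

```python
import json

# Assumption: the server-side body limit is unknown, so pick a
# conservative cap and split any batch whose JSON exceeds it.
MAX_BODY_BYTES = 1_000_000  # hypothetical value, not a documented limit

def split_oversized(batch, max_bytes=MAX_BODY_BYTES):
    """Recursively halve a batch until each piece serializes under max_bytes."""
    body = json.dumps(batch).encode("utf-8")
    if len(body) <= max_bytes or len(batch) <= 1:
        return [batch]
    mid = len(batch) // 2
    return (split_oversized(batch[:mid], max_bytes)
            + split_oversized(batch[mid:], max_bytes))
```

Each resulting piece can then be sent as its own request, which also keeps any single failure small.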
basicbass
basicbass2w ago
Thanks for the pointers @ɃØĦɆᵾS. And yeah, lots of orgs, right?! The service is applicable to the UK charity sector, SMB through Enterprise, so quite a large target base in the absence of an ICP at the moment. There are around 200k charities in the UK, so the target list is just a subset, believe it or not. In my case, I'm just running locally, so I'm not concerned about DDoS, but I'm definitely going to be spacing out those requests to avoid killing my machine (probably just with a time.sleep(), as I'll quickly script this in Python). It seems like 1500 rows is accepted via the UI, and given there's not much difference in character count between rows, I'll probably just try to do this in batches of 1500 and see how I get on. 🤞
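The plan above (fixed-size batches with a sleep between calls) might look like this minimal Python sketch. The `post_batch` callable is a placeholder for whatever wraps the actual HTTP POST to the createManyCompanies endpoint; the batch size and delay are just the values discussed in the thread:

```python
import time

BATCH_SIZE = 1500     # what the UI reportedly accepts
DELAY_SECONDS = 10    # breathing room between requests

def batches(rows, size=BATCH_SIZE):
    """Yield consecutive slices of `rows`, each with at most `size` items."""
    for i in range(0, len(rows), size):
        yield rows[i:i + size]

def upload_all(rows, post_batch, size=BATCH_SIZE, delay=DELAY_SECONDS):
    """Send each batch via `post_batch` (e.g. a requests.post wrapper
    hitting the bulk-create endpoint), pausing between calls."""
    for batch in batches(rows, size):
        post_batch(batch)
        time.sleep(delay)
```

A fixed sleep is the simplest way to avoid hammering the server, at the cost of wasted wall-clock time when calls finish quickly.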
ɃØĦɆᵾS
ɃØĦɆᵾS2w ago
Good luck then. One more tip: if you happen to send a request with too big a body and the server responds with a 500 error code, restart it, since it'll lock itself and throw a 500 every time you send a POST request. (I'm not sure whether waiting long enough for the server to process everything would also resolve it; whenever this happens to me, I simply reset the database.)
basicbass
basicbass2w ago
Thanks, will report back! So far so good. I've ended up processing in batches of 500 with a 10-second delay between calls. Each call is taking around 3 seconds to complete on my machine, and we're up to 20,000 without issue. To be honest, though, if I had thought about it properly rather than rushing through over a coffee, I should have just executed the next batch on receipt of a 200 and handled errors gracefully. But I'm impatient, and this will (hopefully) be a one-off.
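The improvement described above (send the next batch only after the previous one returns 200, with graceful error handling) could be sketched like this. `post_batch` is again a hypothetical wrapper that returns the HTTP status code; the retry counts and backoff base are arbitrary illustrative values:

```python
import time

def upload_with_backoff(all_batches, post_batch, max_retries=3, base_delay=1.0):
    """Send each batch only after the previous one succeeds; on a
    non-200 response, retry with exponential backoff instead of a
    fixed sleep between every call."""
    for batch in all_batches:
        for attempt in range(max_retries):
            status = post_batch(batch)  # assumed to return the HTTP status code
            if status == 200:
                break
            time.sleep(base_delay * 2 ** attempt)  # back off before retrying
        else:
            raise RuntimeError(f"batch failed after {max_retries} attempts")
```

Pacing on responses rather than a fixed sleep removes the idle time when calls are fast, while the backoff still protects the server when things go wrong.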
basicbass
basicbass2w ago
huzzah. Thanks again for your help @ɃØĦɆᵾS
charles
charles2w ago
We are likely to raise this limit to 20k in 0.30, released on Thursday 🙂 I'll ping you once it's out. Closing this for now 🙂