I'm trying to import data from an SQL file

I'm trying to import data from an SQL file through the HTTP API, but I'm getting a statement too long: SQLITE_TOOBIG error. The whole SQL file is 24 MB and has one insert statement per line (each statement ends with a semicolon followed by a line break). The table I'm inserting into has 13 columns. The longest insert statement line is 751 bytes. Are there any limitations of the D1 HTTP API I'm not aware of? Could this be a bug? Or am I maybe doing something wrong?

Update: The problem was quotes not being escaped. I escaped those by doubling them and the problem was solved!
10 Replies
Jun
Jun2mo ago
Cloudflare Docs
Import and export data | Cloudflare D1 docs
D1 allows you to import existing SQLite tables and their data directly, enabling you to migrate existing data into D1 quickly and easily. This can be useful when migrating applications to use Workers and D1, or when you want to prototype a schema locally before importing it to your D1 database(s).
naourass
naourassOP2mo ago
Not really since I'm already doing only one independent insert with a single value set per statement!
naourass
naourassOP2mo ago
@fry69 Here's a sample of the file I'm importing. The real file has 55,000 similar lines with 24mb total size, but I picked only 3 lines in the sample including the longest one.
naourass
naourassOP2mo ago
And not sure if this could be important too, but the DB has some triggers that update an FTS5 virtual table on inserts.
Ahz
Ahz2mo ago
depends what you mean by 'import data via the api' exactly
Ahz
Ahz2mo ago
Odd, I'd try the Wrangler CLI next
naourass
naourassOP2mo ago
I'll be doing that periodically and for multiple DBs/datasets, which is why I opted for the HTTP API, to automate things in a Node script... BTW, reducing the import file to only 40k lines (17 MB) rather than 55k lines (24 MB) seems to work fine.

Update: I was using the wrong checksum; even 40k lines is failing.
Ahz
Ahz2mo ago
could also be dirty data in those last 15k lines, something not escaped
naourass
naourassOP2mo ago
Tried both the first and second 40k offsets; both of them import successfully! Looks like it's something to do with some limitation...

After correcting the checksum and successfully importing a 200 KB SQL file with 500 insert statements, it looks like only 344 records were inserted... I rechecked the file and it indeed has 500 insert statement lines, and I checked the table's record count both in the dashboard UI and with a wrangler execute count command. You are probably right, I'll inspect further...

@Ahz / Grimlor You were absolutely right, there were quotes that needed escaping!