I'm trying to import data from an SQL file through the HTTP API, but I'm getting a
statement too long: SQLITE_TOOBIG
error. The whole SQL file is 24 MB and has one INSERT statement per line (each statement ends with a semicolon followed by a line break). The table I'm inserting into has 13 columns. The longest INSERT statement line is 751 bytes. Are there any limitations of the D1 HTTP API I'm not aware of? Could this be a bug? Or am I maybe doing something wrong?
Update: The problem was quotes not being escaped. I escaped them by doubling them and the problem was solved!
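For anyone hitting the same thing: SQLite escapes a single quote inside a string literal by doubling it. A minimal sketch of the fix when building the import file from Node (escapeSqlString is a hypothetical helper, not part of any D1 library):

```ts
// SQLite string literals escape a single quote by doubling it.
// escapeSqlString is a hypothetical helper for building INSERT lines.
function escapeSqlString(value: string): string {
  return "'" + value.replace(/'/g, "''") + "'";
}

// Example: building one INSERT line for the import file.
const title = "O'Reilly";
const line = `INSERT INTO books (title) VALUES (${escapeSqlString(title)});`;
// -> INSERT INTO books (title) VALUES ('O''Reilly');
```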
Hey! Would this be helpful by any chance?
https://developers.cloudflare.com/d1/build-with-d1/import-export-data/#resolve-statement-too-long-error
Not really, since I'm already doing one independent INSERT with a single value set per statement!
@fry69 Here's a sample of the file I'm importing. The real file has 55,000 similar lines and is 24 MB in total, but I picked only 3 lines for the sample, including the longest one.
And not sure if this could be important too, but the DB has some triggers that update an FTS5 virtual table on inserts.
Depends what you mean by 'import data via the API' exactly.
Odd, I'd try the Wrangler CLI next.
I'll be doing this periodically and for multiple DBs/datasets, which is why I opted for the HTTP API, so I can automate things in a Node script...
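For reference, the automation I have in mind is just POSTing batches of statements to the D1 REST query endpoint from Node. A rough sketch, assuming the standard /query endpoint; ACCOUNT_ID, DATABASE_ID, and API_TOKEN are placeholders you supply yourself:

```ts
// Placeholders for the Cloudflare account, D1 database, and API token.
const ACCOUNT_ID = process.env.CF_ACCOUNT_ID!;
const DATABASE_ID = process.env.D1_DATABASE_ID!;
const API_TOKEN = process.env.CF_API_TOKEN!;

// Send one batch of SQL to the D1 REST query endpoint (Node 18+ has
// a global fetch). Throws on any non-2xx response.
async function runSql(sql: string): Promise<void> {
  const url = `https://api.cloudflare.com/client/v4/accounts/${ACCOUNT_ID}/d1/database/${DATABASE_ID}/query`;
  const res = await fetch(url, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${API_TOKEN}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ sql }),
  });
  if (!res.ok) {
    throw new Error(`D1 query failed: ${res.status} ${await res.text()}`);
  }
}
```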
BTW reducing the import file to only 40k lines (17mb) rather than 55k lines (24mb) seems to work fine.
Update: I was using the wrong checksum, even 40k lines is failing.
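In case it helps anyone else: the checksum in the import flow appears to be the MD5 hex digest of the exact bytes you upload, so hashing anything other than the raw file contents (e.g. a re-encoded string copy) produces a mismatch like this. A minimal Node sketch under that assumption:

```ts
import { createHash } from "node:crypto";
import { readFile } from "node:fs/promises";

// Assumption: the import flow expects the MD5 hex digest of the exact
// bytes uploaded. Hash the raw Buffer, not a re-encoded string copy.
async function fileMd5(path: string): Promise<string> {
  const bytes = await readFile(path);
  return createHash("md5").update(bytes).digest("hex");
}
```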
Could also be dirty data in those last 15k lines, something not escaped.
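One quick way to hunt for that: with the one-statement-per-line format, a correctly escaped line should contain an even number of single quotes (two delimiters per literal plus two per doubled escape), so lines with an odd count are suspects. A heuristic sketch, not a real SQL parser:

```ts
import { readFile } from "node:fs/promises";

// Heuristic scan: each string literal contributes an even number of
// single quotes, so a line with an odd count likely has an unescaped
// quote. Returns 1-based line numbers of suspect lines.
async function findSuspectLines(path: string): Promise<number[]> {
  const lines = (await readFile(path, "utf8")).split("\n");
  const suspects: number[] = [];
  lines.forEach((line, i) => {
    const quotes = (line.match(/'/g) ?? []).length;
    if (quotes % 2 === 1) suspects.push(i + 1);
  });
  return suspects;
}
```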