Decompressing large .gz file

Hey there, I want to decompress a large .csv.gz file (~1.3GB) pulled from a remote origin inside a Worker. Then I need to load and iterate over the rows in the decompressed CSV (~4.1GB) and add them to a D1 DB. Any help would be greatly appreciated! Thanks, Artu ❤️
Walshy · 4mo ago
So you'd want https://developer.mozilla.org/en-US/docs/Web/API/DecompressionStream to decompress. Then, to parse and go through the CSV, any regular Node CSV library should work.
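A minimal sketch of that approach, streaming the gzip through DecompressionStream and batching inserts into D1. The remote URL, the `DB` binding, the `rows(col1, col2)` table, and the naive comma split are all placeholders; adjust them to your schema and use a real CSV parser if your data has quoted fields.
```ts
export default {
  async fetch(request: Request, env: { DB: D1Database }): Promise<Response> {
    const upstream = await fetch("https://example.com/data.csv.gz"); // hypothetical URL
    if (!upstream.ok || !upstream.body) return new Response("fetch failed", { status: 502 });

    // Decompress and decode as text without buffering the whole 4.1GB file.
    const text = upstream.body
      .pipeThrough(new DecompressionStream("gzip"))
      .pipeThrough(new TextDecoderStream());

    const reader = text.getReader();
    const insert = env.DB.prepare("INSERT INTO rows (col1, col2) VALUES (?1, ?2)");
    let carry = "";
    let batch: D1PreparedStatement[] = [];

    while (true) {
      const { value, done } = await reader.read();
      if (done) break;
      carry += value;
      const lines = carry.split("\n");
      carry = lines.pop() ?? ""; // keep the trailing partial line for the next chunk
      for (const line of lines) {
        if (!line.trim()) continue;
        const [col1, col2] = line.split(","); // naive split; swap in a CSV parser for quoted fields
        batch.push(insert.bind(col1, col2));
        if (batch.length >= 100) {
          await env.DB.batch(batch); // one batched statement call instead of one per row
          batch = [];
        }
      }
    }
    if (carry.trim()) {
      const [col1, col2] = carry.split(",");
      batch.push(insert.bind(col1, col2));
    }
    if (batch.length) await env.DB.batch(batch);

    return new Response("done");
  },
};
```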
Jürgen Leschner
Hi @Artucuno, if you run into subrequest limits when streaming and calling the D1 client API from your Worker, you might consider using D1 import, which can upload and then import larger SQL files. https://developers.cloudflare.com/api/operations/cloudflare-d1-import-database
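If you go that route, you first need to turn the CSV into a SQL file of INSERT statements. A rough local sketch (Node.js assumed; the `rows` table, two-column layout, and file names are placeholders) that you can then feed to the import API above or to `wrangler d1 execute <DATABASE_NAME> --remote --file=./data.sql`:
```ts
import { createReadStream, createWriteStream } from "node:fs";
import { createGunzip } from "node:zlib";
import { createInterface } from "node:readline";

// Convert data.csv.gz into data.sql, one INSERT per row.
async function csvGzToSql(src: string, dest: string): Promise<void> {
  const out = createWriteStream(dest);
  const lines = createInterface({
    input: createReadStream(src).pipe(createGunzip()),
    crlfDelay: Infinity,
  });

  let first = true;
  for await (const line of lines) {
    if (first) { first = false; continue; } // skip the header row
    if (!line.trim()) continue;
    const [col1, col2] = line.split(","); // naive split; use a CSV parser for quoted fields
    const esc = (v: string) => `'${v.replace(/'/g, "''")}'`; // escape single quotes for SQL
    out.write(`INSERT INTO rows (col1, col2) VALUES (${esc(col1)}, ${esc(col2)});\n`);
  }
  out.end();
}

csvGzToSql("data.csv.gz", "data.sql").catch(console.error);
```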
Artucuno (OP) · 4mo ago
Thanks!