Unable to import huge CSV (500MB+).
I've been increasing my memory limit, execution time, and so on, but I'm still stuck when trying to import large data.
8 Replies
Adjust the chunk size:
https://filamentphp.com/docs/3.x/actions/prebuilt-actions/import#changing-the-import-chunk-size
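Roughly like this, per the linked docs (ProductImporter is just a placeholder for your own importer class):

```php
use App\Filament\Imports\ProductImporter;
use Filament\Actions\ImportAction;

ImportAction::make()
    ->importer(ProductImporter::class)
    // process fewer rows per chunk so each chunk stays within memory limits
    ->chunkSize(100)
```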
I suspect your chunk size is too high and each chunk hits the limitations of the server.
I adjusted the chunk size as well, but since the data is so large, it still consumes too much memory and crashes.
Try reducing it down to a chunk size of 50; this shouldn't use a lot of memory, but it will take a little longer. The reason is that by using chunkSize, Dan built it so that we use League CSV's fgetcsv, which means we can read directly from the file as opposed to loading the whole file into memory.
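To illustrate what that streaming read looks like under the hood, here's a rough League CSV sketch (the file path and header handling are just placeholders); the reader iterates over the file stream rather than pulling everything into memory:

```php
use League\Csv\Reader;

// open the CSV as a stream; the whole file is never loaded into memory
$csv = Reader::createFromPath('/path/to/huge.csv', 'r');
$csv->setHeaderOffset(0); // treat the first row as the header

// getRecords() returns an iterator, so rows are read one at a time
foreach ($csv->getRecords() as $record) {
    // process a single row here
}
```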
Are you using Mac/Linux to import?
Using Linux
How'd you get on at 50?
My IDE crashed even at 5. Do we have a LazyCollection option here?
Your IDE crashed? Why are you loading it in your IDE?
There's no need for a lazy collection. It's using chunkSize, which selects the first X records; that's natively part of the League CSV handler, which is huge and highly optimised.
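For comparison, a LazyCollection approach outside of Filament would look roughly like this (the path and chunk size are placeholders), but as above, chunkSize already gives you the same streaming behaviour:

```php
use Illuminate\Support\LazyCollection;

LazyCollection::make(function () {
    // stream the file row by row with a generator instead of loading it all
    $handle = fopen(storage_path('app/huge.csv'), 'r');

    while (($row = fgetcsv($handle)) !== false) {
        yield $row;
    }

    fclose($handle);
})
    ->chunk(50)
    ->each(function ($rows) {
        // process 50 rows at a time
    });
```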
I think we can solve this problem in several ways