Unable to import huge CSV (500MB+).

I've been increasing the allowed memory limit, execution time, and so on, but I'm still stuck trying to import this large dataset:
Allowed memory size of 12884901888 bytes exhausted (tried to allocate 2174129888 bytes)
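For reference, 12884901888 bytes is a 12G memory_limit, so my php.ini is already along these lines (values illustrative, not an exact copy of my config):
```ini
; php.ini (illustrative values)
memory_limit = 12G          ; 12G = 12884901888 bytes, matching the error above
max_execution_time = 600    ; seconds; raised alongside the memory limit
```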
8 Replies
toeknee · 2w ago
Adjust the chunk size: https://filamentphp.com/docs/3.x/actions/prebuilt-actions/import#changing-the-import-chunk-size I suspect your chunk size is too high and each chunk hits the limits of the server.
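Something like this, as a sketch (ProductImporter is just a stand-in for your own importer class):
```php
use App\Filament\Imports\ProductImporter; // placeholder for your importer
use Filament\Actions\ImportAction;

ImportAction::make()
    ->importer(ProductImporter::class)
    ->chunkSize(100) // fewer rows per chunk = less memory per queued job
```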
Yuvraj Timalsina (OP) · 2w ago
I adjusted the chunk size as well, but since the data is so large, it still consumes too much memory and crashes.
toeknee · 2w ago
Try reducing it down to a chunk size of 50; this shouldn't use a lot of memory but will take a little longer. The reason is that by using chunkSize, Dan built it so that we use league/csv, which means we can read directly from the file as opposed to loading the whole file into memory. Are you using Mac/Linux to import?
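To show why that keeps memory flat, here's a minimal league/csv sketch (the file path is illustrative) of how records get streamed from disk instead of loaded wholesale:
```php
use League\Csv\Reader;

// Streams records from disk one at a time instead of
// loading the entire 500MB file into memory.
$csv = Reader::createFromPath('/path/to/huge.csv', 'r');
$csv->setHeaderOffset(0); // treat the first row as the header

foreach ($csv->getRecords() as $record) {
    // $record is a single associative row
}
```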
Yuvraj Timalsina (OP) · 2w ago
Using Linux.
toeknee · 2w ago
How'd you get on at 50?
Yuvraj Timalsina (OP) · 2w ago
My IDE crashed even at 5. Do we have a LazyCollection option here?
toeknee · 2w ago
Your IDE crashed? Why are you loading it in your IDE? There's no need for a lazy collection. It's using chunkSize, which selects the first X records at a time; it's natively part of the league/csv handler, which is mature and highly optimised.
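For comparison, a LazyCollection version would look something like this (a sketch; the path and chunk size are illustrative), but it just re-does what chunkSize already handles:
```php
use Illuminate\Support\LazyCollection;

LazyCollection::make(function () {
    $handle = fopen('/path/to/huge.csv', 'r');
    while (($row = fgetcsv($handle)) !== false) {
        yield $row; // only one row is held in memory at a time
    }
    fclose($handle);
})->chunk(50)->each(function ($rows) {
    // persist each chunk of 50 rows here
});
```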
Takashi · 2w ago
I think we can solve this problem in several ways.