Bulk Actions on Large Datasets
When trying to execute bulk actions on a dataset of 200 user models, I get memory issues.
For example, when I try to export more than 200 user models using the ExportBulkAction, my application freezes. Is there any way to optimize this?
Am I doing something wrong? Exporting 400 users doesn't sound like it should be an issue.
21 Replies
Are you using the FilamentPHP Exporter or a custom bulk action? If it's a custom bulk action, I would suggest you look at the bulk action queue system.
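For reference, a queued custom bulk action might look roughly like this. This is a minimal sketch, not the poster's actual code: `SendEmailsJob` is a hypothetical job class, and the chunk size is arbitrary; `BulkAction::make()`, the `Collection $records` closure, and `deselectRecordsAfterCompletion()` are standard Filament table API.

```php
use Filament\Tables\Actions\BulkAction;
use Illuminate\Database\Eloquent\Collection;

BulkAction::make('sendEmails')
    ->action(function (Collection $records) {
        // Dispatch one queued job per chunk of primary keys,
        // instead of processing every model in the web request.
        // SendEmailsJob is a hypothetical queued job class.
        $records->pluck('id')
            ->chunk(100)
            ->each(fn ($ids) => SendEmailsJob::dispatch($ids->all()));
    })
    ->deselectRecordsAfterCompletion();
```

Note that this only offloads the work itself; as discussed below, the selected records may still be loaded into memory before the closure runs.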
I'm using the Filament Exporter, but the action isn't even being reached. It feels like the action first tries to load all the selected records into memory and then executes, and loading the models into memory is where it freezes. Any bulk action with this amount of records freezes.
The Filament exporter runs through queues, so is the job even reaching the queue?
Like I said, the exporter isn't even being reached. This isn't an exporter issue, it's a bulk action issue; the exporter was just an example. If I make a custom action that just updates a column or sends an email for each record, the same issue occurs.
Have you got the debugbar on?
The 400-records example was with the debugbar turned off. With it turned on, it froze at even fewer records.
FilamentPHP uses Livewire, and Livewire sends the whole table across the wire.
Yes, I know. So are you saying that a bulk action on more records is impossible using Filament?
It should suffice when you're not showing more than 100 results and bulk-actioning them via "select all records".
Otherwise, use a table header action.
When I show 50 records, click "select all records", and then run the bulk action, it freezes. I'm not showing all records on the table, if that's what you thought, sorry.
This is what I'm doing: showing 50 records (the table has 400 in total), select one page, click "select all records", click the bulk action -> the application freezes.
Is this on a slow computer or a live server? I've just generated 585 PDFs from selecting 585 results in a bulk action, showing 25; it took a few seconds.
No, they should both be able to handle this, I believe.
What is your memory limit?
Feck... get Telescope installed and view the request to see what's using so much memory.
I just did. As soon as I hit the bulk action button, the application froze, Telescope included.
Alright, tried again. The application froze, but Telescope didn't register a request.
Can you provide your resource code?
Of what exactly?
Hi @Remi Hindriks,
did you manage to solve the large dataset issue?
@Haydra In my case, it turned out the records were eager loading a ton of extra data, which made the memory fill up. None of it was needed in my table or export, so I solved it by adding withoutGlobalScopes() to my table query.
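For anyone hitting the same thing, the fix described above can be applied roughly like this in a Filament resource. This is a sketch, not the poster's actual code, assuming Filament v3's `modifyQueryUsing()`; adjust to your own resource and scopes.

```php
use Filament\Tables\Table;
use Illuminate\Database\Eloquent\Builder;

public static function table(Table $table): Table
{
    return $table
        // Strip global scopes (and any eager loading they attach)
        // so the table and its bulk actions don't pull unneeded
        // related data into memory for every selected record.
        ->modifyQueryUsing(fn (Builder $query) => $query->withoutGlobalScopes())
        ->columns([
            // ...
        ]);
}
```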
No, there is no eager loading in my scenario.
The bulk action I'm building is for attaching records to a created record.
When I select around 100k records it freezes; I read that's because of memory.
Any related topic or idea for solving this issue?