KnightOfRohan
Question about the File Upload Security announcement on May 20th regarding file names and mimetypes.
This is probably a dumb question because I'm really new to coding, but there is a post in #💫┊announcements from May 20th of this year regarding file upload security, and the linked docs say that the `acceptedFileTypes()` function "uses Laravel's `mimetypes` rule which does not validate the extension of the file, only its mime type, which could be manipulated."
Is this the rule being referred to? https://laravel.com/docs/11.x/validation#rule-mimetypes
And if so, it says the "file's contents will be read and the framework will attempt to guess the MIME type, which may be different from the client's provided MIME type."
When it says it reads the file's contents, doesn't that mean it's not just looking at the MIME type provided by the client/browser?
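From what I can tell (and I may be wrong here), Laravel delegates the guessing to Symfony's MIME component, which sniffs the file's actual bytes, something like:

```php
use Symfony\Component\Mime\MimeTypes;

// Guess the MIME type from the file's actual contents (magic bytes),
// not from the client-supplied Content-Type header or the extension.
$guessed = MimeTypes::getDefault()->guessMimeType('/path/to/upload.pdf');

var_dump($guessed); // e.g. "application/pdf"
```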
What's also a little confusing is that right underneath that, there is a `mimes` rule ( https://laravel.com/docs/11.x/validation#rule-mimes ), which kind of says the same thing...? The only difference I can tell is that `mimetypes` compares the file against a list of MIME type strings, while `mimes` compares it against the MIME types that correspond to a list of file extensions, but in both cases it sounds to me like they are actually reading the binary content of the file to get its real MIME type, and not just some property that can be falsified.
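To make sure I'm comparing the right things, here's how I understand the two rules being used (illustrative field names, not my actual code):

```php
use Illuminate\Support\Facades\Validator;

// `mimetypes`: you list MIME type strings directly.
$byMimeType = Validator::make($request->all(), [
    'upload' => 'mimetypes:application/pdf,image/png',
]);

// `mimes`: you list extensions; Laravel maps them to MIME types
// and compares against the guessed (content-based) MIME type.
$byExtension = Validator::make($request->all(), [
    'upload' => 'mimes:pdf,png',
]);
```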
Am I misunderstanding what the Filament or Laravel docs are saying? Any clarification would be greatly appreciated.
3 replies
Import action freezes browser window when importing 86k rows of data. How to troubleshoot?
Hello, I am trying to understand how Filament's import action works so that I can troubleshoot it further.
Could someone point me in the direction of where I could understand the Filament import action's process for importing a CSV file? For example, does it load the entire file into memory, or does it read it in parts? I know there is chunking functionality, but if I'm not mistaken, that only applies to the stage where chunks of rows are dispatched to be processed in batches. I'm more curious about the loading and reading of the file before it is chunked for processing, because an app I am working on seems to freeze in the browser during larger imports.
I am able to import smaller files of about 1,000 rows without issue. However, when I attempt to import a larger file of ~86k rows (8 MB), the window freezes. I need to be able to import files with up to a million or more rows (40+ MB).
I have set the PHP `memory_limit` to 2G and the chunk size of the import action to 100. I know 100 is the default, but I wanted to be sure.
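For reference, here's roughly how I'm configuring it (the importer class name is just a placeholder):

```php
use Filament\Actions\ImportAction;
use App\Filament\Imports\ProductImporter; // placeholder importer class

ImportAction::make()
    ->importer(ProductImporter::class)
    // Rows are queued for processing in batches of this size.
    ->chunkSize(100);
```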
I am relatively new to PHP, Laravel, and Filament, and I am not sure where to look. I have tried reading through the docs, and unless I'm missing something, I don't see anything for managing or troubleshooting the reading/loading of the file. I see things like lifecycle hooks, but most of them seem to relate to the lifecycle after the file has been read.
Is there a way to avoid loading the entire CSV file into memory, and instead read only the specified number of rows at a time, in chunks?
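In case it helps clarify what I mean, this is the kind of streaming read I'm imagining (plain PHP sketch, not Filament's actual implementation, which I haven't read):

```php
/**
 * Read a CSV in fixed-size chunks without loading the whole file.
 * A sketch of the behavior I'm hoping for, not Filament's internals.
 *
 * @return iterable<array<int, array<int, string|null>>>
 */
function readCsvInChunks(string $path, int $chunkSize = 100): iterable
{
    $handle = fopen($path, 'rb');
    $chunk = [];

    while (($row = fgetcsv($handle)) !== false) {
        $chunk[] = $row;

        if (count($chunk) >= $chunkSize) {
            yield $chunk; // hand off a batch, then free it
            $chunk = [];
        }
    }

    if ($chunk !== []) {
        yield $chunk; // remaining partial batch
    }

    fclose($handle);
}

// Usage: memory stays bounded by the chunk size, not the file size.
foreach (readCsvInChunks('/path/to/huge.csv', 100) as $rows) {
    // dispatch $rows to a queued job, etc.
}
```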
Any help or insight would be appreciated.
50 replies