Minification or compression of large API returns
I need some way to compress responses from an API that needs to return massive volumes of data. I know about Gzip, but is there any way to compress the response or encode it to a less cumbersome value? Note: My API returns JSON
It depends on the consumers of the API. You could theoretically use any compression algorithm you want, but if the consumer of the API can't decompress it, it's quite useless.
So it highly depends on the user: if you expect it to be consumable by web front ends, you would have to check for or write compatible decompression libraries in JS/Wasm or similar.
If it is just for your own custom client application, you have a free hand.
But usually the rule of thumb is: whatever the browsers support.
I interpreted API here as web API, btw.
Yes, web API.
But in the end: minify (it's also less load for your server) and then pick a compression that's supported by HTTP (https://en.wikipedia.org/wiki/HTTP_compression) and widely used.
(Especially the Content-Encoding tokens section is of interest there.)
Interesting, I will use it.
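To make the Content-Encoding negotiation concrete, here is a minimal sketch (Node.js/TypeScript is assumed, as are the placeholder rows data and port 8080): the server sends minified JSON and only gzips when the client's Accept-Encoding header says it can handle it.

```typescript
// Sketch: serve minified JSON, gzip-compressed only when the client
// advertises support via Accept-Encoding. Node.js built-ins only.
import { createServer } from "node:http";
import { gzipSync } from "node:zlib";

const rows = [{ id: 1, name: "example" }]; // stand-in for the real data

createServer((req, res) => {
  // JSON.stringify without an indent argument is already "minified".
  const body = JSON.stringify(rows);

  if (req.headers["accept-encoding"]?.includes("gzip")) {
    // The Content-Encoding token tells the client how to decompress.
    res.writeHead(200, {
      "Content-Type": "application/json",
      "Content-Encoding": "gzip",
    });
    res.end(gzipSync(body));
  } else {
    res.writeHead(200, { "Content-Type": "application/json" });
    res.end(body);
  }
}).listen(8080);
```

Browsers and most HTTP clients send Accept-Encoding: gzip by default and decompress transparently, which is why "whatever the browsers support" is the safe rule.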
But I need to provide context.
But also check for browser support.
Imagine you enter some
https://example.org/some/file.json
and you receive a binary file.
The problem is that this API will be responsible for accessing databases with millions of records, so my concern is with performance. The database access part is OK; my concern is with the volume that I need to return in the response. Do you believe that using a compression algorithm can do the trick?
Well, who/what is consuming that API endpoint?
Mostly desktop applications.
So some end user on their own network. Why are you transferring millions of records? oO
Because the machines on which they will be installed do not have direct access to the networked database.
So someone had the idea of making a bridge through an API. Basically, it is configured on the IIS server of the client that has access to the database, and the application consumes this API.
I'm refactoring the API to make it faster.
Dunno if I made myself clear.
Yeah, but usually you have pagination and stuff, so you don't transfer that massive amount of records.
If you really have to transfer that much, I would probably first stream the data via WebSocket in binary format and then look for a way to compress it in some chunked manner.
How would I do that?
Pagination is an option in some cases; unfortunately, in others it is not. But I will definitely implement it where I can.
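For reference, a minimal pagination sketch (the endpoint shape and the page/size parameter names are assumptions, not from the thread; in a real API the slice would be a database query such as OFFSET/FETCH or a keyset predicate, not an in-memory array):

```typescript
// Sketch: the client asks for one page at a time instead of
// millions of rows at once. Node.js built-ins only.
import { createServer } from "node:http";

const allRows = Array.from({ length: 100_000 }, (_, i) => ({ id: i }));

createServer((req, res) => {
  const url = new URL(req.url ?? "/", "http://localhost");
  const page = Number(url.searchParams.get("page") ?? "0");
  // Cap the page size so one request can never ask for millions of rows.
  const size = Math.min(Number(url.searchParams.get("size") ?? "1000"), 10_000);

  const items = allRows.slice(page * size, (page + 1) * size);
  res.writeHead(200, { "Content-Type": "application/json" });
  // Returning total lets the client compute how many pages remain.
  res.end(JSON.stringify({ page, size, total: allRows.length, items }));
}).listen(8080);
```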
What do you refer to here?
"stream the data via WebSocket in binary format"
Well, come up with a binary format, encode the rows in that format, and send them via WebSocket.
Actually, that doesn't even need to be WebSockets; it can be a plain HTTP endpoint as well 🤔
The point is, JSON is quite fat: it's textual, it contains property names, etc. A pure binary format can surely reduce the size, but you will have to decode it on the receiving end.
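As an illustration of "come up with a binary format", here is a hand-rolled fixed-width row encoding (the Row shape with id and price is invented for this example; in practice an existing format such as MessagePack or Protocol Buffers would do the same job with less effort):

```typescript
// Sketch: each row = 4-byte little-endian id + 8-byte double price.
// No property names, no quotes, no commas, unlike JSON text.
type Row = { id: number; price: number };

function encodeRows(rows: Row[]): Buffer {
  const buf = Buffer.alloc(rows.length * 12);
  rows.forEach((row, i) => {
    buf.writeUInt32LE(row.id, i * 12);
    buf.writeDoubleLE(row.price, i * 12 + 4);
  });
  return buf;
}

function decodeRows(buf: Buffer): Row[] {
  const rows: Row[] = [];
  for (let off = 0; off < buf.length; off += 12) {
    rows.push({ id: buf.readUInt32LE(off), price: buf.readDoubleLE(off + 4) });
  }
  return rows;
}
```

Each row costs 12 bytes here, versus roughly 30+ bytes for the equivalent {"id":1,"price":9.99} JSON text, before any compression is applied.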
Understood.
That was my point: bypassing the weight of JSON.
The main things here are probably, for one, the binary transmission, and for the other, that you can start processing the data while the rest is still loading.
I don't have much experience with compression, but there surely are differences between "compress it all and spit it out afterwards" and compressing as a stream.
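A sketch of the streaming variant (NDJSON, i.e. one JSON object per line, is my assumption here, and rowSource is a stand-in for a database cursor): the gzip transform compresses chunk by chunk, so the consumer can start decoding rows long before the response finishes.

```typescript
// Sketch: rows streamed as NDJSON through a gzip transform.
// Compression happens incrementally with backpressure via pipe().
import { createServer } from "node:http";
import { createGzip } from "node:zlib";
import { Readable } from "node:stream";

async function* rowSource() {
  for (let id = 0; id < 1_000_000; id++) {
    // In reality each row would come from a database cursor.
    yield JSON.stringify({ id }) + "\n";
  }
}

createServer((_req, res) => {
  // Accept-Encoding negotiation (shown earlier) is skipped for brevity.
  res.writeHead(200, {
    "Content-Type": "application/x-ndjson",
    "Content-Encoding": "gzip",
  });
  Readable.from(rowSource()).pipe(createGzip()).pipe(res);
}).listen(8080);
```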
There are also some docs by MS that might help: https://learn.microsoft.com/en-us/aspnet/core/performance/response-compression?view=aspnetcore-8.0