C# · 11mo ago
VicXaulim

Minification or compression of large API returns

I need some way to compress responses from an API that has to return massive volumes of data. I know about gzip, but is there any way to compress the response, or encode it into something less bulky? Note: my API returns JSON.
18 Replies
cap5lut · 11mo ago
It depends on the consumers of the API. You could theoretically use any compression algorithm you want, but if the consumer of the API can't decompress it, it's quite useless. So it highly depends on the user: if you expect it to be consumed by web front ends, you would have to check for (or write) compatible decompression libraries in JS/Wasm or similar. If it is just for your own custom client application, you have a free hand. But the usual rule of thumb is: whatever the browsers support. (I interpreted "API" as web API here, btw.)
VicXaulim (OP) · 11mo ago
yes, a web API
cap5lut · 11mo ago
But in the end: minify (that's also less load for your server), and then pick a compression that is supported by HTTP (https://en.wikipedia.org/wiki/HTTP_compression) and widely used. The "Content-Encoding tokens" section there is of particular interest.
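Since the thread mentions IIS/ASP.NET later on, here is a minimal sketch of what that advice looks like in ASP.NET Core, using the built-in response compression middleware; the `/data` endpoint and its payload are invented for illustration:

```csharp
// A sketch of enabling HTTP response compression in ASP.NET Core.
using System.IO.Compression;
using Microsoft.AspNetCore.ResponseCompression;

var builder = WebApplication.CreateBuilder(args);

builder.Services.AddResponseCompression(options =>
{
    options.EnableForHttps = true; // explicit opt-in; be aware of BREACH/CRIME implications
    options.Providers.Add<BrotliCompressionProvider>();
    options.Providers.Add<GzipCompressionProvider>();
    options.MimeTypes = new[] { "application/json" };
});
builder.Services.Configure<BrotliCompressionProviderOptions>(o =>
    o.Level = CompressionLevel.Fastest);

var app = builder.Build();
app.UseResponseCompression(); // must run before the endpoints that write responses
app.MapGet("/data", () => Results.Json(new { hello = "world" }));
app.Run();
```

The middleware negotiates the algorithm via the client's `Accept-Encoding` header, which is exactly the "check what the consumer supports" concern from above. Note that when the API is hosted behind IIS, compression can alternatively be enabled at the server level instead.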
VicXaulim (OP) · 11mo ago
Interesting, I will use it, but I need to provide some context.
cap5lut · 11mo ago
But also check for browser support. Imagine you enter some https://example.org/some/file.json and receive a binary file.
VicXaulim (OP) · 11mo ago
The problem is that this API will be responsible for accessing databases with millions of records, so my concern is performance. The database access part is OK; my concern is the volume of data I need to return in the response. Do you believe that using a compression algorithm can do the trick?
cap5lut · 11mo ago
Well, who/what is consuming that API endpoint?
VicXaulim (OP) · 11mo ago
mostly desktop applications
cap5lut · 11mo ago
So some end user on their own network. Why are you transferring millions of records? oO
VicXaulim (OP) · 11mo ago
Because the machines on which they will be installed do not have direct access to the database over the network, someone had the idea of making a bridge through an API. Basically, it is configured on the IIS server of the client that has access to the database, and the application consumes this API. I'm refactoring the API to make it faster. Dunno if I made myself clear.
cap5lut · 11mo ago
Yeah, but usually you have pagination and such, so you don't transfer that massive amount of records. If you really have to transfer that much, I would probably first stream the data via WebSocket in a binary format, and then look for a way to compress it in some chunked manner.
VicXaulim (OP) · 11mo ago
How would I do this? Pagination is an available option in some cases; unfortunately, in others it is not. But I will definitely implement it.
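For the cases where pagination is possible, a sketch with ASP.NET Core minimal APIs could look like this; the in-memory `records` list and the `Record` type stand in for the real database query:

```csharp
// A sketch of a paged JSON endpoint (ASP.NET Core minimal APIs).
var builder = WebApplication.CreateBuilder(args);
var app = builder.Build();

// Hypothetical data; in the real API this would be a database query.
var records = Enumerable.Range(1, 1_000_000)
    .Select(i => new Record(i, $"name{i}"))
    .ToList();

// GET /records?page=3&pageSize=1000
app.MapGet("/records", (int page = 1, int pageSize = 1000) =>
{
    var items = records
        .OrderBy(r => r.Id)              // a stable order makes paging deterministic
        .Skip((page - 1) * pageSize)
        .Take(pageSize);
    return Results.Json(new { page, pageSize, items });
});

app.Run();

record Record(int Id, string Name);
```

Against a real database, the `Skip`/`Take` would translate to `OFFSET`/`FETCH` in SQL, so only one page's worth of rows travels over the wire per request; keyset ("seek") pagination scales better for deep pages, but the shape of the endpoint is the same.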
cap5lut · 11mo ago
What do you refer to here?
VicXaulim (OP) · 11mo ago
stream the data via websocket in binary
cap5lut · 11mo ago
Well, come up with a binary format, encode the rows in that format, and send them via WebSocket. Actually, that doesn't even need to be WebSockets; it can be a plain HTTP endpoint as well 🤔 The point is, JSON is quite fat: it's textual, and it repeats the property names in every object. A pure binary format can surely reduce the size, but you will have to decode it on the receiving end.
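To make the size difference concrete, here is a small self-contained C# sketch comparing JSON against a hand-rolled binary layout; the row shape (`int Id`, `double Value`, `string Name`) is invented for illustration:

```csharp
// A sketch comparing JSON output against a hand-rolled binary row layout.
using System;
using System.IO;
using System.Linq;
using System.Text;
using System.Text.Json;

var rows = Enumerable.Range(0, 1000)
    .Select(i => new { Id = i, Value = i * 0.5, Name = $"row{i}" })
    .ToArray();

// JSON repeats every property name in every single object.
byte[] json = JsonSerializer.SerializeToUtf8Bytes(rows);

// Binary: fixed-width fields plus length-prefixed strings, no property names.
using var ms = new MemoryStream();
using (var w = new BinaryWriter(ms, Encoding.UTF8, leaveOpen: true))
{
    w.Write(rows.Length);   // row count up front, so the reader knows when to stop
    foreach (var r in rows)
    {
        w.Write(r.Id);      // 4 bytes
        w.Write(r.Value);   // 8 bytes
        w.Write(r.Name);    // 7-bit-encoded length prefix + UTF-8 bytes
    }
}
byte[] binary = ms.ToArray();

Console.WriteLine($"json: {json.Length} bytes, binary: {binary.Length} bytes");
```

The decoding side mirrors this with a `BinaryReader` reading the fields back in the same order. In practice an established binary serializer (MessagePack, Protocol Buffers, etc.) does the same thing with far less room for version-skew bugs between server and client.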
VicXaulim (OP) · 11mo ago
Understood. That was my point: bypassing the weight of JSON.
cap5lut · 11mo ago
The main benefits here are probably, for one, the binary transmission, and for another, that the consumer can start processing the data while the rest is still loading. I don't have much experience with compression, but there surely are differences between "compress it all and spit it out afterwards" and compressing a stream as it goes.
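A sketch of the streaming variant, using `GZipStream` and flushing after each chunk so the receiver can start inflating before the sender is finished; the chunk contents and the `MemoryStream` standing in for the network are made up:

```csharp
// A sketch of chunked/streaming compression with GZipStream: flushing after each
// chunk emits a deflate block that could be sent over the wire immediately.
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

var wire = new MemoryStream(); // stands in for the network stream

using (var gzip = new GZipStream(wire, CompressionLevel.Fastest, leaveOpen: true))
{
    for (int chunk = 0; chunk < 10; chunk++)
    {
        byte[] bytes = Encoding.UTF8.GetBytes($"chunk {chunk};");
        gzip.Write(bytes, 0, bytes.Length);
        gzip.Flush(); // push the current deflate block out right away
    }
}

// Receiving side: decompress the stream back into text.
wire.Position = 0;
using var inflate = new GZipStream(wire, CompressionMode.Decompress);
using var reader = new StreamReader(inflate, Encoding.UTF8);
string roundTripped = reader.ReadToEnd();
Console.WriteLine(roundTripped);
```

Frequent flushes trade some compression ratio for latency, since each flush cuts the current deflate block short; batching a few thousand rows per flush is a reasonable middle ground.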
Joschi · 11mo ago
Response compression in ASP.NET Core
Learn about response compression and how to use Response Compression Middleware in ASP.NET Core apps.