C# · 8mo ago
shanto

Performance degradation during Exporting Data

Hi, hope you guys are doing well. I need some help. I have an application with, say, 10,000 records, and the business requirement is to let users download up to 1,000 documents at a time as Excel or Word. The client side works like this: suppose an author has 1,500 books. To export them, the user does:

Download 1–1000 (call the API with page number 1 and page size 1000)
Download 1001–1500 (call the API with page number 2 and page size 500)

Note: users download one such range at a time, i.e. one API call per export.

What I was doing is querying the database with a page size of 1000 in the repository, binding the results in the service layer, and then passing them to an export API that writes them out as Excel/Word. But it is taking too much time. Lakhs of users use our application daily, so this is a serious performance degradation. What can I do?

The generated documents contain images, CSS and tables, and I am using the Open XML SDK. I build the file in a MemoryStream and return the data as a byte array. Any suggestions for improvement?
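One thing worth trying (not suggested in the thread, just a sketch) is to stop materializing the whole document as a `byte[]` per request. The Open XML SDK needs a seekable stream, so writing straight into the HTTP response body is awkward, but a temp file with `DeleteOnClose` lets ASP.NET Core stream the result to the client without ever holding the 6–7 MB file in memory. `IExportService.WriteWorkbook` here is a hypothetical stand-in for the existing export logic:

```csharp
using Microsoft.AspNetCore.Mvc;

public class ExportController : ControllerBase
{
    private readonly IExportService _exportService; // hypothetical service name

    public ExportController(IExportService exportService) => _exportService = exportService;

    [HttpGet("export")]
    public IActionResult Export(int page, int pageSize)
    {
        // The Open XML SDK requires a seekable stream, so write to a temp
        // file instead of a MemoryStream. DeleteOnClose removes the file
        // automatically once the response has been streamed to the client.
        var tempPath = Path.GetTempFileName();
        var stream = new FileStream(tempPath, FileMode.Open, FileAccess.ReadWrite,
                                    FileShare.None, 64 * 1024, FileOptions.DeleteOnClose);

        _exportService.WriteWorkbook(stream, page, pageSize); // your existing export code
        stream.Position = 0;

        // FileStreamResult copies the stream to the response in chunks and
        // disposes it afterwards; memory usage stays flat per request.
        return File(stream,
            "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
            "export.xlsx");
    }
}
```

This mainly helps with per-request memory pressure and GC load under many concurrent users; it does not by itself speed up document generation.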
5 Replies
Angius · 8mo ago
Do people really just casually download 1500 files generated on the fly? Are all the files strictly unique? If not, I would certainly cache what I can in a CDN of some sort.
Unknown User · 8mo ago
Message Not Public
shanto (OP) · 8mo ago
Every record is unique, and I made a mistake in my first message: the goal is to download a single file (xlsx/docx) containing thousands of records that come from the API. The current document, with thousands of records including images, is 6–7 MB, and the overall process (calling the API, writing to Excel, and exporting) takes 1–2 minutes. Thanks for the response, but my requirement is a little different; kindly check the message above. Sorry about the mistake.
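For thousands of rows, building the whole spreadsheet DOM in memory is usually the slow part. The Open XML SDK also offers a forward-only `OpenXmlWriter` that emits elements as it goes. A minimal sketch, assuming plain text cells (images and styling left out for brevity):

```csharp
using DocumentFormat.OpenXml;
using DocumentFormat.OpenXml.Packaging;
using DocumentFormat.OpenXml.Spreadsheet;

public static void WriteLargeSheet(string path, IEnumerable<string[]> rows)
{
    using var doc = SpreadsheetDocument.Create(path, SpreadsheetDocumentType.Workbook);
    var workbookPart = doc.AddWorkbookPart();
    var worksheetPart = workbookPart.AddNewPart<WorksheetPart>();

    // Forward-only writer: each Row is serialized immediately instead of
    // being accumulated in an in-memory SheetData tree.
    using (var writer = OpenXmlWriter.Create(worksheetPart))
    {
        writer.WriteStartElement(new Worksheet());
        writer.WriteStartElement(new SheetData());
        foreach (var row in rows)
        {
            writer.WriteStartElement(new Row());
            foreach (var value in row)
            {
                // Inline strings avoid maintaining a shared-string table.
                var cell = new Cell { DataType = CellValues.InlineString };
                cell.AppendChild(new InlineString(new Text(value)));
                writer.WriteElement(cell);
            }
            writer.WriteEndElement(); // Row
        }
        writer.WriteEndElement(); // SheetData
        writer.WriteEndElement(); // Worksheet
    }

    workbookPart.Workbook = new Workbook(
        new Sheets(new Sheet
        {
            Id = workbookPart.GetIdOfPart(worksheetPart),
            SheetId = 1,
            Name = "Export"
        }));
    workbookPart.Workbook.Save();
}
```

The same pattern works for Word documents (writing `Paragraph`/`Table` elements through an `OpenXmlWriter` on the `MainDocumentPart`), though image parts still have to be added to the package separately.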
Unknown User · 8mo ago
Message Not Public
shanto (OP) · 8mo ago
Fetching 1000 documents from the API takes almost 30 seconds, though on the client side we use pagination with a page size of 50, which returns within 300 ms. The database is MongoDB, and indexes are in place. I get each image URL from the JSON; the image itself is stored in GCP, so while writing the Word file I have to download it and BlipFill it into position. For 1000 records there are 1000 images, each inserted into a table row along with the other information; the table has 8 columns. And no, I haven't profiled my code yet.
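If those 1000 images are downloaded one at a time while writing the document, the network round-trips alone can dominate the 1–2 minutes. A sketch of downloading them concurrently up front, with a `SemaphoreSlim` to cap the number of in-flight requests (the URL list and the concurrency limit of 16 are illustrative assumptions):

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

public static class ImagePrefetcher
{
    public static async Task<Dictionary<string, byte[]>> DownloadImagesAsync(
        IReadOnlyList<string> urls, int maxConcurrency = 16)
    {
        using var http = new HttpClient();
        using var gate = new SemaphoreSlim(maxConcurrency);
        var results = new Dictionary<string, byte[]>();

        var tasks = urls.Select(async url =>
        {
            await gate.WaitAsync(); // throttle concurrent downloads
            try
            {
                var bytes = await http.GetByteArrayAsync(url);
                lock (results) results[url] = bytes; // Dictionary is not thread-safe
            }
            finally
            {
                gate.Release();
            }
        }).ToList();

        await Task.WhenAll(tasks);
        return results;
    }
}
```

With the bytes in hand, the BlipFill step becomes a pure in-memory operation while writing each table row. Profiling first (as suggested above, you haven't yet) would confirm whether the time actually goes to image downloads, the 30-second API fetch, or the document writing itself.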