Fetching all records from an endpoint works, but it's very slow.

I have a call to an API where I need to fetch all records for a certain object; the total is about 39,000 records. The API has a pagination limit of 1,000 (I can't set the limit to 39,000 in one go), so I have to batch the calls. The code I have currently works and pulls all 39,000 records, but the call takes around 30 seconds to a minute to finish. How can I refactor the code below to 1) make it cleaner and 2) make it more performant (in the millisecond-to-one-second territory rather than 30s to 1min)? Any advice here?
38 Replies
JavaBot
JavaBot2y ago
This post has been reserved for your question.
Hey @Kale Vivi! Please use /close or the Close Post button above when you're finished. Please remember to follow the help guidelines. This post will be automatically closed after 300 minutes of inactivity.
TIP: Narrow down your issue to simple and precise questions to maximize the chance that others will reply in here.
Kale Vivi
Kale ViviOP2y ago
private SomeApiResponse getFromApi(int offset) {
    int limit = 1000;
    String someUrl = fromUriString(someApiUri)
            .queryParam("q", "*")
            .queryParam("offset", offset)
            .queryParam("limit", limit)
            .build()
            .toString();

    final HttpHeaders headers = new HttpHeaders();
    headers.setContentType(MediaType.APPLICATION_JSON);
    headers.add("x-apigw-api-id", gatewayId);

    ResponseEntity<Object> responseEntity = oAuth2RestTemplate.exchange(
            someUrl,
            HttpMethod.GET,
            new HttpEntity<>(headers),
            new ParameterizedTypeReference<>() {
            });

    // The response body is a map keyed by some wrapper name; we only care about the first value.
    final Map<String, SomeApiResponse> result =
            objectMapper.convertValue(responseEntity.getBody(), new TypeReference<>() {});
    return result.values().stream().findFirst().get();
}

private SomeApiResponse fetchSomeInfo() {
    SomeApiResponse response = getFromApi(1);

    int noOfRecords = response.getNoOfRecords();
    List<SomeResponse> someInfo = response.getSomeInfo();

    int offset = response.getParams().getOffset();
    if (noOfRecords > response.getParams().getLimit()) {
        while (offset < noOfRecords) {
            // Advance by one page (the limit) per iteration, then fetch that page.
            offset = (offset + response.getParams().getLimit() - 1) + 1;
            someInfo.addAll(getFromApi(offset).getSomeInfo());
        }
    }
    return response;
}
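For the "cleaner" part, here is a minimal refactor sketch of fetchSomeInfo(), assuming the API's offset is 1-based (as in the getFromApi(1) call above), that the first page echoes offset 1 back, and that getNoOfRecords()/getSomeInfo() behave as shown; PAGE_SIZE mirrors the hard-coded limit of 1000 in getFromApi:

private static final int PAGE_SIZE = 1000;

private SomeApiResponse fetchSomeInfo() {
    // First page: fetched with a 1-based offset; it also tells us the total record count.
    SomeApiResponse firstPage = getFromApi(1);
    int totalRecords = firstPage.getNoOfRecords();
    List<SomeResponse> allInfo = firstPage.getSomeInfo();

    // Remaining pages: advance the offset by one page size per request.
    for (int offset = 1 + PAGE_SIZE; offset < totalRecords; offset += PAGE_SIZE) {
        allInfo.addAll(getFromApi(offset).getSomeInfo());
    }
    return firstPage;
}

The increment (offset + limit - 1) + 1 in the original is just offset + limit, and writing it as a plain for loop makes that obvious. It won't make the call faster, though; the time is dominated by the ~39 sequential round trips to the API.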
Crain
Crain2y ago
The only way you're really going to get that is to cache the results. If you don't have control of the database or the underlying API, you can only optimize your caching solution. I'd also recommend looking at your use case: needing all 39,000 records from someone else's API quickly is a sign of an issue with the use case itself. And assuming they aren't load balancing, you can't easily multithread the requests either; you'd essentially be asking to lock their entire table down, so it might ironically be slower to async the requests.
Unknown User
Unknown User2y ago
Message Not Public
Kale Vivi
Kale ViviOP2y ago
What do you mean by "1m to fetch 39k is pretty hefty"? Is that a bad thing, or do you mean that's good?
Crain
Crain2y ago
Bad thing. It's bad that it takes a minute, and bad that you need 39k records.
Unknown User
Unknown User2y ago
Message Not Public
Kale Vivi
Kale ViviOP2y ago
Even worse: after fetching the 39k records, I need to apply a filter to get it down to 500.
Unknown User
Unknown User2y ago
Message Not Public
Crain
Crain2y ago
Wtf
Kale Vivi
Kale ViviOP2y ago
to display in a picklist @_@
Unknown User
Unknown User2y ago
Message Not Public
Crain
Crain2y ago
Does the endpoint/API not have a search function?
Kale Vivi
Kale ViviOP2y ago
I don't know how to do that
Unknown User
Unknown User2y ago
Message Not Public
Kale Vivi
Kale ViviOP2y ago
Basically the trouble is that I want to run a query where someCode is distinct, but how do I filter with a distinct query param? I don't know how to do that, and I'm not sure the API even allows it.
Crain
Crain2y ago
Check their Docs
Kale Vivi
Kale ViviOP2y ago
I did, but there's nothing in the docs. Should I recommend it to them?
Crain
Crain2y ago
Yes. How often is your query called, how often is the underlying API changing, and are you able to locally cache information?
Kale Vivi
Kale ViviOP2y ago
I can technically add the records to a DB table if I want to, as I don't think they'll change often. Is it a good idea to add them to my table instead
Unknown User
Unknown User2y ago
Message Not Public
Kale Vivi
Kale ViviOP2y ago
and fetch from the table after the first load
Crain
Crain2y ago
You only want the distinct codes right?
Kale Vivi
Kale ViviOP2y ago
yes
Crain
Crain2y ago
Nothing from your endpoint can cause that to change?
Kale Vivi
Kale ViviOP2y ago
what do you mean by that?
Crain
Crain2y ago
Someone can't apply another filter beyond that in your endpoint
Kale Vivi
Kale ViviOP2y ago
no
Crain
Crain2y ago
Are you in Spring?
Kale Vivi
Kale ViviOP2y ago
yup
Crain
Crain2y ago
https://www.baeldung.com/spring-cache-tutorial Use this and apply it to your SERVICE LAYER. I'd configure it to wipe at midnight, then try to get it to automatically run the service function after the application starts and after the cache gets evicted.
Baeldung
A Guide To Caching in Spring | Baeldung
How to enable, configure and make good use of the Caching Abstraction in Spring.
Crain
Crain2y ago
Your end users will never notice it takes a minute to do the call, you'll only cache the ~500 records you care about, and since it's in the service layer you can automatically run it to populate the cache early.
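As a rough sketch of that setup, assuming @EnableCaching and @EnableScheduling are configured and using hypothetical names (SomeInfoService, SomeInfoCacheWarmer, fetchAllPagesFromApi, and a cache called "someInfo") that are not part of the existing application:

import org.springframework.boot.context.event.ApplicationReadyEvent;
import org.springframework.cache.annotation.CacheEvict;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.context.event.EventListener;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;
import org.springframework.stereotype.Service;

@Service
public class SomeInfoService {

    // The first caller pays the 30s-1min cost; every later call that day is
    // served from the "someInfo" cache.
    @Cacheable("someInfo")
    public SomeApiResponse fetchSomeInfo() {
        return fetchAllPagesFromApi();
    }

    // Clears the cached result; the annotation does the work, so the body stays empty.
    @CacheEvict(value = "someInfo", allEntries = true)
    public void evictAll() {
    }

    private SomeApiResponse fetchAllPagesFromApi() {
        // Placeholder: wire in the existing getFromApi(...) paging logic
        // (and the filter down to the ~500 records) here.
        throw new UnsupportedOperationException("existing fetch logic goes here");
    }
}

@Component
class SomeInfoCacheWarmer {

    private final SomeInfoService someInfoService;

    SomeInfoCacheWarmer(SomeInfoService someInfoService) {
        this.someInfoService = someInfoService;
    }

    // Populate the cache as soon as the application is up.
    @EventListener(ApplicationReadyEvent.class)
    public void warmOnStartup() {
        someInfoService.fetchSomeInfo();
    }

    // Wipe at midnight, then immediately re-fetch so the cache never goes cold.
    @Scheduled(cron = "0 0 0 * * *")
    public void refreshAtMidnight() {
        someInfoService.evictAll();
        someInfoService.fetchSomeInfo();
    }
}

Evicting and re-fetching from a separate bean also sidesteps the self-invocation pitfall: a @Cacheable method called from within its own class bypasses Spring's caching proxy.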
Kale Vivi
Kale ViviOP2y ago
Okay thanks. I'm taking a look and reading the doc
JavaBot
JavaBot2y ago
If you are finished with your post, please close it. If you are not, please ignore this message. Note that you will not be able to send further messages here after this post has been closed, but you will be able to create new posts.
Kale Vivi
Kale ViviOP2y ago
Should I just use @Cacheable here then, instead of the others? CacheEvict seems interesting, but how does it know which data is stale...? Is it the data that has changed, or...?
Crain
Crain2y ago
Yeah, @Cacheable is what you want. You could create a @CacheEvict locked behind some authorization if you think you might want to manually clear the cache, but that doesn't seem to be the case here.
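If a manual clear is ever wanted, a hypothetical admin-only endpoint (assuming Spring Security with method security enabled) could simply delegate to the evictAll() method from the sketch above; the path and role here are illustrative assumptions:

import org.springframework.security.access.prepost.PreAuthorize;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.RestController;

// Hypothetical controller; not part of the existing application.
@RestController
public class SomeInfoCacheController {

    private final SomeInfoService someInfoService;

    public SomeInfoCacheController(SomeInfoService someInfoService) {
        this.someInfoService = someInfoService;
    }

    @PreAuthorize("hasRole('ADMIN')")
    @DeleteMapping("/admin/some-info-cache")
    public void clearCache() {
        someInfoService.evictAll(); // @CacheEvict on the service clears the "someInfo" cache
    }
}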
Kale Vivi
Kale ViviOP2y ago
ok thanks
JavaBot
JavaBot2y ago
If you are finished with your post, please close it. If you are not, please ignore this message. Note that you will not be able to send further messages here after this post has been closed, but you will be able to create new posts. 💤 Post marked as dormant
This post has been inactive for over 300 minutes, thus, it has been archived. If your question was not answered yet, feel free to re-open this post or create a new one.