Cache response time issue
C# · 4mo ago
M B V R K (OP)

Hi friends, I really don't know where to start :( I'm working on a project implemented using microservices. One of those services, called ElectionService, is composed of 3 main projects (ElectionService.Infrastructure, ElectionService.CQRS, and ElectionService.API). ElectionService.CQRS is the Application layer for this service, and ElectionService.API is an ASP.NET Core 8 web API that is the entry point for the service. As you will notice, the ElectionService.CQRS (Application layer) project is implemented using the CQRS pattern. Recently I tried to add a caching mechanism to this service, so I used the distributed SQL Server cache, and this is the first time I've dealt with it.
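
For context, the SQL Server-backed IDistributedCache is typically registered roughly like this; the connection string name and the schema/table names below are placeholders, not the project's actual values:

// Program.cs registration sketch (Microsoft.Extensions.Caching.SqlServer package).
// "CacheDb" and the table/schema names here are placeholders.
builder.Services.AddDistributedSqlServerCache(options =>
{
    options.ConnectionString = builder.Configuration.GetConnectionString("CacheDb");
    options.SchemaName = "dbo";
    options.TableName = "QueryCache"; // created beforehand with `dotnet sql-cache create`
});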
18 Replies
M B V R K (OP) · 4mo ago
Example: I have this query handler:
public class GetCandidatesQueryHandler : BaseQueryHandler<GetCandidatesQuery, GetCandidatesQueryResult>
{
    public GetCandidatesQueryHandler(IMapper mapper, IMediator mediator, AppDbContext dbContext, IDistributedCache distributedCache)
        : base(mapper, mediator, dbContext, distributedCache)
    {
    }

    public override async Task<GetCandidatesQueryResult> Handle(GetCandidatesQuery query, CancellationToken cancellationToken)
    {
        try
        {
            // Use the cached JSON if allowed and present; otherwise load from the database and map.
            var candidates = query.UseCacheIfAvailable && await _distributedCache.GetStringAsync(query.CacheKey, cancellationToken) is string cache ?
                JsonSerializer.Deserialize<IEnumerable<GetCandidatesQueryResultDto>>(cache) :
                _mapper.Map<IEnumerable<GetCandidatesQueryResultDto>>(await _dbContext.Candidates.ToListAsync(cancellationToken));

            // Note: this maps again even when `candidates` is already a collection of DTOs.
            var queryResultDto = _mapper.Map<IEnumerable<GetCandidatesQueryResultDto>>(candidates);
            var queryResult = GetCandidatesQueryResult.Succeeded(queryResultDto);

            if (!query.UseCacheIfAvailable) return queryResult;

            // Fire-and-forget: the returned Task is not awaited.
            _mediator.Send(new SetQueryCacheEntry(query.CacheKey, queryResult.Value));

            return queryResult;
        }
        catch (Exception ex)
        {
            return GetCandidatesQueryResult.Failed(ex);
        }
    }
}
The issue: when a cache entry is available, the application takes a long time to get the data from the cache, but when there is no cache entry or the entry has expired (meaning the app has to select the data from the database and then cache it), it takes much less time. I tested against the Candidates table, which contains 10,000 records: when the data comes from the database it takes 3 seconds, but when it comes from the cache it takes 30 seconds. (I used Postman and Stopwatch to measure the elapsed time.) Please, any help with this issue?
Saber · 4mo ago
What's the point of caching that?
canton7 · 4mo ago
Have you looked to see why it's taking so long? Chances are it's in the distributedCache, or maybe the mediator?
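
One way to check where the time goes is to time each stage of the cache path separately. A minimal sketch using the handler's fields (this is not the project's actual diagnostic code):

var sw = System.Diagnostics.Stopwatch.StartNew();

// Stage 1: raw read from the distributed cache.
var cached = await _distributedCache.GetStringAsync(query.CacheKey, cancellationToken);
Console.WriteLine($"Cache read:  {sw.ElapsedMilliseconds} ms");

// Stage 2: JSON deserialization of the cached payload.
sw.Restart();
var dtos = JsonSerializer.Deserialize<IEnumerable<GetCandidatesQueryResultDto>>(cached!);
Console.WriteLine($"Deserialize: {sw.ElapsedMilliseconds} ms");

// Stage 3: the extra DTO-to-DTO mapping the handler performs on the cache path.
sw.Restart();
var mapped = _mapper.Map<IEnumerable<GetCandidatesQueryResultDto>>(dtos);
Console.WriteLine($"Re-map:      {sw.ElapsedMilliseconds} ms");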
M B V R K (OP) · 4mo ago
After a lot of diagnostics, the issue is mostly due to deserialization: deserializing and mapping 10,000 records takes a long time. So for now the right solution is not to cache that many items (10,000) under one key. The best solution I found is to cache by pagination (meaning each page is cached separately under its own key, as in the sketch below), and that solved my issue. But please, if there is any advice, I'm here to learn from your experiences. Massive thanks again <3
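
A rough sketch of that page-keyed approach, assuming hypothetical PageNumber/PageSize properties on the query (the names and key format here are illustrative, not the project's actual types):

// Each page gets its own key, so a single cache entry stays small.
string cacheKey = $"candidates:page:{query.PageNumber}:size:{query.PageSize}";

var cached = await _distributedCache.GetStringAsync(cacheKey, cancellationToken);
if (cached is not null)
    return JsonSerializer.Deserialize<List<GetCandidatesQueryResultDto>>(cached)!;

// Cache miss: read just one page from the database.
var page = await _dbContext.Candidates
    .OrderBy(c => c.Id) // stable ordering so pages don't overlap
    .Skip((query.PageNumber - 1) * query.PageSize)
    .Take(query.PageSize)
    .ToListAsync(cancellationToken);

var dtos = _mapper.Map<List<GetCandidatesQueryResultDto>>(page);

await _distributedCache.SetStringAsync(
    cacheKey,
    JsonSerializer.Serialize(dtos),
    new DistributedCacheEntryOptions { AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5) },
    cancellationToken);

return dtos;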
mtreit · 4mo ago
How big is the data, roughly, in bytes? 30 seconds sounds frankly ridiculous. @M B V R K I can JSON deserialize a million (small) records (about 65 MB worth of data) in about 1.6 seconds, just as a rough point of comparison. That's 625,000 per second. You're saying your solution is doing a little over 300 per second, so 3 milliseconds per item. Either your data is enormous or you are running the code on a potato...
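
That kind of throughput number is easy to sanity-check with a standalone round trip like this (a sketch; the record shape and count are made up for the comparison):

using System;
using System.Diagnostics;
using System.Linq;
using System.Text.Json;

// Build a million small records, serialize them, then time deserialization.
var items = Enumerable.Range(0, 1_000_000)
    .Select(i => new Candidate(i, $"Candidate {i}"))
    .ToList();

string json = JsonSerializer.Serialize(items);
Console.WriteLine($"Payload: {json.Length / (1024.0 * 1024.0):F1} MB");

var sw = Stopwatch.StartNew();
var roundTripped = JsonSerializer.Deserialize<List<Candidate>>(json);
sw.Stop();
Console.WriteLine($"Deserialized {roundTripped!.Count:N0} records in {sw.ElapsedMilliseconds} ms");

record Candidate(int Id, string Name);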
M B V R K (OP) · 4mo ago
It's 22 MB. Maybe you are right about the potato, because currently I'm working on a 2nd-gen i5 with 14mb RAM.
mtreit · 4mo ago
14MB? I don't believe you. That's 1980s levels of RAM.
M B V R K (OP) · 4mo ago
lol sorry I meant 14GB :kekw:
mtreit · 4mo ago
14GB is plenty
M B V R K (OP) · 4mo ago
I heard before that there is a performance difference between Newtonsoft.Json and System.Text.Json, where Newtonsoft.Json performs better; but in my case I used System.Text.Json.
mtreit · 4mo ago
That's the opposite. Newtonsoft is much worse for performance.
M B V R K (OP) · 4mo ago
Aaaah. From your experience, are there any other considerations or causes for this issue?
mtreit · 4mo ago
How big is the data in bytes roughly?
M B V R K (OP) · 4mo ago
22 MB, only one table.
canton7 · 4mo ago
$close
MODiX · 4mo ago
If you have no further questions, please use /close to mark the forum thread as answered
Gokul · 4mo ago
I have faced a similar kind of issue, where we were saving whole-table data that we would later fetch and process before returning it as a response. The best thing to do is to not store lists of huge objects in any caching layer; instead of saving entire tables in the cache, understand the data and how it could be restructured in a way that reduces the amount of data stored under each key. I would love to help you come up with a solution; if it's a personal project I can take a look at it on GitHub.

I have worked with Redis, and the general rule of thumb is that if a cache key holds more than 2 MB of data, something is wrong with the current approach. I know newer caching software can handle large sets of data, but remember: in your example you are reading a table and creating one key, and that data can and will get bigger. Storing a table as a whole under a single key is not optimal; similarly, storing each row under its own key is also not optimal, because once the number of keys gets large, pattern-based searching will take time, since it has to scan a large set of keys.

The best way to approach this problem is to identify your data needs: can you split the data into reasonable subsets, for example by grouping on a type (column)? See the sketch below.
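
For example, bucketing by a grouping column might look like this (a sketch; the Candidate type, its Party property, and the key format are hypothetical):

// Illustrative only: `cache` is an IDistributedCache and `candidates` is a
// list already loaded from the database. Each bucket gets its own small key
// instead of one giant key holding the whole table.
static async Task CacheByPartyAsync(IDistributedCache cache, List<Candidate> candidates)
{
    foreach (var bucket in candidates.GroupBy(c => c.Party))
    {
        string key = $"candidates:party:{bucket.Key}";
        await cache.SetStringAsync(key, JsonSerializer.Serialize(bucket.ToList()));
    }
}

// A read then fetches only the subset a request actually needs:
// var json = await cache.GetStringAsync("candidates:party:Independent");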
Gokul · 4mo ago
Check this article on how they restructured the data to save memory; it applies to all caching software: https://instagram-engineering.com/storing-hundreds-of-millions-of-simple-key-value-pairs-in-redis-1091ae80f74c