❔ ASP.NET Core Persist Task between requests

I am programming an API and one of my endpoints needs to do a database operation, but it will take a long time, and the result is not actually needed until a different endpoint is called. I would like to queue the Task on the first endpoint and return a 202 Accepted, then only have to await the task (if it has not already completed) on the other endpoint. I’m not sure what the best way to do this would be. I could have a static dictionary of tasks keyed by users stored in a service, but this seems naive. I’ve heard about background workers but this is an on-demand thing so I’m not sure if that’s appropriate either as I get the feeling they’re for cron-type things. Any advice would be appreciated.
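For illustration, a minimal sketch of the dictionary-of-tasks idea described above, assuming a singleton registration; the names SavePreloadService, ISaveLoader and SaveData are invented:

```cs
using System.Collections.Concurrent;
using Microsoft.Extensions.DependencyInjection;

// Sketch only: a singleton that tracks one in-flight load per account.
// SavePreloadService, ISaveLoader and SaveData are hypothetical names.
public class SavePreloadService
{
    private readonly ConcurrentDictionary<string, Task<SaveData>> _pending = new();
    private readonly IServiceScopeFactory _scopeFactory;

    public SavePreloadService(IServiceScopeFactory scopeFactory) => _scopeFactory = scopeFactory;

    // First endpoint: start the DB work and return immediately (controller returns 202 Accepted).
    public void StartPreload(string accountId) =>
        _pending.GetOrAdd(accountId, id => Task.Run(async () =>
        {
            // A singleton can't capture the request's scoped DbContext,
            // so open a fresh scope for the background work.
            using var scope = _scopeFactory.CreateScope();
            var loader = scope.ServiceProvider.GetRequiredService<ISaveLoader>();
            return await loader.LoadSaveAsync(id);
        }));

    // Second endpoint: await the task if it hasn't finished yet.
    public async Task<SaveData?> GetSaveAsync(string accountId) =>
        _pending.TryRemove(accountId, out var task) ? await task : null;
}
```

The first endpoint would then just call StartPreload and return Accepted(), and the second would await GetSaveAsync, falling back to a normal load if it returns null.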
95 Replies
dreadfullydistinct
This looks sort of like what I want https://learn.microsoft.com/en-us/aspnet/core/fundamentals/host/hosted-services?view=aspnetcore-7.0&tabs=visual-studio#queued-background-tasks so I will try it later, but I was wondering if anyone knew offhand what the best approach would be. Also, with this approach, is it possible to cache the results of the tasks for, say, 5 minutes in case the second endpoint is called again?
Background tasks with hosted services in ASP.NET Core
Learn how to implement background tasks with hosted services in ASP.NET Core.
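Regarding caching the result for ~5 minutes: one hedged option, assuming something like the preload service sketched above plus an injected IMemoryCache (_cache, from Microsoft.Extensions.Caching.Memory), is to stash the completed result with an absolute expiration:

```cs
// Sketch only: keep the finished result around for 5 minutes so a repeat
// call to the second endpoint doesn't redo the database work.
public async Task<SaveData?> GetSaveAsync(string accountId)
{
    if (_cache.TryGetValue(accountId, out SaveData? cached))
        return cached;

    if (_pending.TryRemove(accountId, out var task))
    {
        var result = await task;
        _cache.Set(accountId, result, new MemoryCacheEntryOptions
        {
            AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(5)
        });
        return result;
    }

    return null;
}
```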
dreadfullydistinct
Yeah, it isn’t very REST haha. To be more specific, I have an endpoint /foo which is called before /bar. /bar has to do a load of database work, and it is sometimes taking so long that there’s a timeout after 15 seconds. So I was thinking of starting the db work in the background at /foo, and because the client waits a few seconds before calling /bar, hopefully reduce the amount of time processing /bar takes.
dreadfullydistinct
Not really, it’s loading a user’s save for a game thing. So it’s personalised to each user
dreadfullydistinct
I’ve tried optimising the db query but I wasn’t really sure where to start
dreadfullydistinct
Yeah, across a few tables. S3?
dreadfullydistinct
Well, the save is, like, encoded in the database schema. It’s not a single file as such.
dreadfullydistinct
Hmm. Just makes it easier to work with, I guess.
dreadfullydistinct
It’s like, one user has many characters, so when we load the save we collect all the characters with their user id.
dreadfullydistinct
It is all just text, but there’s quite a bit of it. A few hundred entities maybe. When it gets sent as JSON it’s about 2MB.
dreadfullydistinct
It’s primary keyed by the account id, and then some have a composite key. So for characters, you can only have one character of each type per person.
dreadfullydistinct
So the composite key is account id and character id.
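(For reference, a composite key like that is configured in EF Core roughly as below; the entity and property names are guesses.)

```cs
// Composite primary key of (account id, character id), so each account
// can only have one row per character type. Names are hypothetical.
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<DbPlayerCharacter>()
        .HasKey(c => new { c.DeviceAccountId, c.CharacterId });
}
```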
dreadfullydistinct
Uh, not now, I’m on my morning commute 😛 But yeah, it does end up being multiple queries. I’m using EF and I tried doing a join via nav properties, but there are a lot of one-to-many relationships, so it timed out and told me to do split queries anyway.
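The split-query behaviour EF suggests can be opted into per query with AsSplitQuery(); a sketch with guessed navigation names:

```cs
// Each Include is sent as its own SQL statement instead of one huge join,
// which avoids the row explosion from many one-to-many collections.
var save = await context.Players
    .Where(p => p.DeviceAccountId == deviceAccountId)
    .Include(p => p.Characters)
    .Include(p => p.Dragons)
    .Include(p => p.Builds)
    .AsSplitQuery()
    .SingleOrDefaultAsync();
```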
dreadfullydistinct
Do you want the EF Core statement or the raw SQL it compiles to? I imagine the latter.
dreadfullydistinct
Yup, as I’ve found out trying to optimise it. Just looked up index properties though, and I should probably make the account id an index for all those composite key tables.
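That would look something like this in the fluent config, on each of the per-account tables (names are guesses):

```cs
// Non-unique index on the account id column so per-account filters can seek
// instead of scanning. Entity/property names are hypothetical.
modelBuilder.Entity<DbPlayerDragon>()
    .HasIndex(d => d.DeviceAccountId);
```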
dreadfullydistinct
Postgres
dreadfullydistinct
Actually it does if I foreign key them, which I was trying to do to speed this up. So maybe I’d better do that as well.
dreadfullydistinct
It’s all a bit messy with too many layers, but I can link you the actual GitHub of the problem controller.
dreadfullydistinct
Those repository methods are all basically a .Where(x => x.deviceaccountid == deviceaccountid)
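i.e. roughly this shape (a sketch; whether the method materialises the results or hands back the IQueryable is what the discussion below turns on):

```cs
// Typical shape of those repository methods: filter one table by account id
// and materialise the results. Names are hypothetical.
public async Task<IReadOnlyList<DbPlayerCharacter>> GetCharactersAsync(string deviceAccountId)
{
    return await _context.PlayerCharacters
        .Where(x => x.DeviceAccountId == deviceAccountId)
        .ToListAsync();
}
```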
dreadfullydistinct
Lol, I was just copying what we do at work, mainly because it’s much easier to test with mocking. But I have heard it isn’t recommended. Should I inject the context directly?
dreadfullydistinct
Uh, well, isn’t the repository basically a service with the context injected?
dreadfullydistinct
They’re all scoped
dreadfullydistinct
Hm, I’ll keep that in mind. It’s not likely to be massively reducing my performance though, right?
dreadfullydistinct
It’s not mega; in terms of raw storage space the prod environment is using less than 100MB last I checked. But the characters and dragons and builds could be a couple hundred entities each.
dreadfullydistinct
It’s not 10 minutes, the worst end was like 15 seconds.
dreadfullydistinct
Not really. Characters is less than dragons, because you can have multiple of the same dragon. And builds is probably about constant per account.
dreadfullydistinct
The thing is, sometimes it’s sub 1 second; EF Core can be weird. Yeah, I think account id indices will improve it.
dreadfullydistinct
So this idea about preloading it is probably barking up the wrong tree, you think? I have some commits locally to make all these disparate tables foreign key into a central table containing just the account ids. So that should add some indices and hopefully speed things up.
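A sketch of that central-table idea, with guessed names (DbPlayerCharacter is the same hypothetical entity as above); EF Core conventions will typically also create an index on the foreign key column:

```cs
using Microsoft.EntityFrameworkCore;

// Central table containing just the account ids; per-account tables
// (characters, dragons, builds, ...) foreign-key back into it.
public class DbPlayer
{
    public string DeviceAccountId { get; set; } = string.Empty;
    public List<DbPlayerCharacter> Characters { get; set; } = new();
}

public class ApiContext : DbContext
{
    public ApiContext(DbContextOptions<ApiContext> options) : base(options) { }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        modelBuilder.Entity<DbPlayer>()
            .HasKey(p => p.DeviceAccountId);

        modelBuilder.Entity<DbPlayerCharacter>()
            .HasOne<DbPlayer>()
            .WithMany(p => p.Characters)
            .HasForeignKey(c => c.DeviceAccountId);
    }
}
```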
dreadfullydistinct
That’s a good idea
dreadfullydistinct
Is a Stopwatch good for that? Well, I find the stats highly misleading. It always says like 3ms each in the logging, lmao. Probably because compiling the query takes the longest. I don’t know enough about the inner workings.
dreadfullydistinct
Anyway, I gotta get to work. Cheers for the advice.
khamas
khamas · 2y ago
you can just have a concurrent dict of all the results that you then look up in the second endpoint
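e.g. (a sketch, storing the completed result rather than the Task; names invented):

```cs
using System.Collections.Concurrent;

// Singleton holding completed results, keyed by account id.
public class SaveResultStore
{
    private readonly ConcurrentDictionary<string, SaveData> _results = new();

    // First endpoint stores the result when the background work finishes.
    public void Store(string accountId, SaveData save) => _results[accountId] = save;

    // Second endpoint takes it out again (removing it so it isn't served twice).
    public bool TryTake(string accountId, out SaveData? save) =>
        _results.TryRemove(accountId, out save);
}
```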
Tvde1
Tvde1 · 2y ago
pfft, that's fine. You could actually say I am a fan of it, as long, of course, as the repository is truly a repository and does not leak EF queryables. For example, you should be able to switch it out for a completely different implementation like Dapper or manual SQL queries. That way you can truly do onion architecture and your business logic will not depend on Entity Framework. But if your repository returns IQueryables, then there is no use to it; you're just hindering yourself.
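Roughly the distinction being made (a sketch):

```cs
// Leaky: callers can keep composing EF queries, so the implementation can
// never be swapped for Dapper or raw SQL without breaking them.
public interface ICharacterRepositoryLeaky
{
    IQueryable<DbPlayerCharacter> GetCharacters(string deviceAccountId);
}

// Abstracted: the contract is expressed in results, so EF, Dapper or raw SQL
// can sit behind it interchangeably.
public interface ICharacterRepository
{
    Task<IReadOnlyList<DbPlayerCharacter>> GetCharactersAsync(string deviceAccountId);
}
```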
Tvde1
Tvde1 · 2y ago
yes, so then it's not really a repository, because repositories must abstract away the concrete implementation. You can't even leak entities that will be modified.
dreadfullydistinct
My purpose in doing it is so that I can test the query methods. Mocking a DbSet suuucks. But if I have the queries in a separate layer I can do an integration test with an in-memory db. Could make it return concrete types instead of queryables I guess, but again, I was just copying what we do at work lol.
Tvde1
Tvde1 · 2y ago
A lot of people come up with this, and I had to remove it from two projects at work. Now we just use the context directly. Most of the code is the same; it's just a lot fewer boilerplate interfaces and classes that immediately return a DbSet.
dreadfullydistinct
I guess my concern is when you have the EF Core auto-update stuff, e.g. var entity = context.Find(key); entity.Property = 5;. Without using an in-memory db, how can you check that the property is set to 5? Currently I use mock repositories with Moq and Verify, so I can verify the repository’s update method is called with an argument of 5.
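A sketch of the kind of Moq verification described here (the service, method and property names are invented):

```cs
// Arrange: the service under test gets a mocked repository.
var mockRepo = new Mock<ICharacterRepository>();
var service = new CharacterService(mockRepo.Object);

// Act: business logic that is supposed to set the property to 5.
await service.SetLevelAsync("account-1", characterId: 3, level: 5);

// Assert: the repository's update method was called with the expected value.
mockRepo.Verify(
    r => r.UpdateCharacterAsync(It.Is<DbPlayerCharacter>(c => c.Level == 5)),
    Times.Once);
```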
Tvde1
Tvde1 · 2y ago
good luck mocking IQueryable 😬
dreadfullydistinct
.AsQueryable().BuildMock() works fine. It’s another package though.
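(That BuildMock() extension comes from the MockQueryable package, if I've got it right; used roughly like this when the mocked method hands back an IQueryable. A sketch only, and the exact return type varies between package versions:)

```cs
// Assumption: MockQueryable's BuildMock() wraps a list in an IQueryable that
// supports the EF async operators (ToListAsync etc.). Older versions returned
// a Mock<IQueryable<T>>, in which case .Object is needed.
var characters = new List<DbPlayerCharacter>
{
    new() { DeviceAccountId = "account-1", CharacterId = 3 },
};

var mockRepo = new Mock<ICharacterRepositoryLeaky>();
mockRepo
    .Setup(r => r.GetCharacters("account-1"))
    .Returns(characters.AsQueryable().BuildMock());

// Code under test can now await ToListAsync()/FirstOrDefaultAsync() on the result.
```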
dreadfullydistinct
But if DbContexts were easier to mock then maybe I would just do that. Not for this project, it’s too much work.
dreadfullydistinct
Yeah, a lot of the unit tests feel a bit pointless. Still, when the services have complicated logic it’s nice to be able to mock the data coming from the db. But I suppose you can just as easily use InMemory.
dreadfullydistinct
SQLite? I use SQLite for, like, the ones closer to end-to-end where you’re actually calling an endpoint. But for the more isolated ones I use InMemory. Maybe I should switch those over.
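For reference, the usual SQLite-in-memory wiring for EF Core tests looks roughly like this (ApiContext is an assumed context name):

```cs
using Microsoft.Data.Sqlite;
using Microsoft.EntityFrameworkCore;

// The in-memory SQLite database lives as long as this connection stays open,
// and unlike the InMemory provider it enforces relational behaviour (FKs, constraints).
using var connection = new SqliteConnection("DataSource=:memory:");
connection.Open();

var options = new DbContextOptionsBuilder<ApiContext>()
    .UseSqlite(connection)
    .Options;

using var context = new ApiContext(options);
context.Database.EnsureCreated();
```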
dreadfullydistinct
Yeah, actually, to be honest, running locally there’s no reason they can’t use a real db. I just didn’t want actual data from manual testing to contaminate it. Maybe I could spin up another one using the Docker CLI?
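One hedged option for that (assuming Docker is available locally) is the Testcontainers.PostgreSql package; the builder API below is from memory, so treat it as an assumption:

```cs
// Spin up a throwaway Postgres container for the test run, point the
// DbContext at it, and tear it down afterwards.
var postgres = new PostgreSqlBuilder()
    .WithImage("postgres:16-alpine")
    .Build();

await postgres.StartAsync();

var options = new DbContextOptionsBuilder<ApiContext>()
    .UseNpgsql(postgres.GetConnectionString())
    .Options;

// ... run the tests against a context created from these options ...

await postgres.DisposeAsync();
```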
dreadfullydistinct
But then how does it work on GitHub Actions?
dreadfullydistinct
Containers within containers
Tvde1
Tvde1 · 2y ago
you can run containers on GitHub Actions too
dreadfullydistinct
I’m sure it can be done, yeah. Does the same go for replacing Redis with the in-memory distributed cache, on another note? I wouldn’t think so, as I’m not using rejson or anything.
dreadfullydistinct
Well, I’m not losing sleep over that one, because it’s all IDistributedCache.
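i.e. because everything talks to IDistributedCache, the swap is just a registration change; a sketch (the "Redis" connection string name is made up):

```cs
// Production: Redis behind IDistributedCache
// (Microsoft.Extensions.Caching.StackExchangeRedis).
builder.Services.AddStackExchangeRedisCache(options =>
    options.Configuration = builder.Configuration.GetConnectionString("Redis"));

// Tests: same IDistributedCache interface, in-memory implementation.
builder.Services.AddDistributedMemoryCache();
```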
dreadfullydistinct
So if they all satisfy the same interface it should be fine
dreadfullydistinct
And if it’s not then it probably isn’t my fault
dreadfullydistinct
Eh true
dreadfullydistinct
I feel like the distributed cache is much more straightforward than a relational db though, so less room for implementation-specific behaviour. But if I go through all the work to spin up containers for the tests, it may be worth switching anyway. I agree it should be as close as possible.
Accord
Accord · 2y ago
Was this issue resolved? If so, run /close - otherwise I will mark this as stale and this post will be archived until there is new activity.