What's your use case? Bit more context?

1984 Ford Laser (OP) · 2w ago
@gruntlord6 jump in here, so we aren't spamming
gruntlord6 · 2w ago
right, but now I'm thinking of a bigger workflow where I spin up sub-objects to do a task, and DOs seem easier to do that with than regular Workers, which I guess is why Workflows is a product
1984 Ford Laser (OP) · 2w ago
yeah Workflows is built on DOs
gruntlord6 · 2w ago
so is everything, apparently
1984 Ford Laser (OP) · 2w ago
Does your app have, like, users? Or something else that is an object defined within your backend?
gruntlord6 · 2w ago
this one doesn't, but being single-threaded makes it a bit slower
1984 Ford Laser (OP) · 2w ago
like, how are you categorising your data? For my use case it's streams of sports games, so each DO instance is a single match with things like schedule, scores, players etc, using an ID to call each match's DO from the matches namespace
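For reference, a minimal sketch of the pattern being described, assuming a `MATCHES` Durable Object namespace binding (the binding name and query parameter are illustrative, not from the chat):
```ts
// Worker that routes each request to the DO for a single match.
export interface Env {
  MATCHES: DurableObjectNamespace;
}

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    // The match ID is assumed to arrive as a query parameter.
    const matchId = new URL(request.url).searchParams.get("match") ?? "demo";
    // Derive a stable DO ID from the match name and get its stub.
    const id = env.MATCHES.idFromName(matchId);
    const stub = env.MATCHES.get(id);
    // Forward the request to that match's DO instance.
    return stub.fetch(request);
  },
};
```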
gruntlord6 · 2w ago
there's a list it has of named datasets; it gets each individual dataset, then pools it together and sends some formatted data to R2, then packages everything from the local SQL storage to D1
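The flow described might look roughly like this sketch, assuming a SQLite-backed DO plus `BUCKET` (R2) and `DB` (D1) bindings; the upstream URL and `items` table are illustrative, not from the chat:
```ts
export interface Env {
  BUCKET: R2Bucket;
  DB: D1Database;
}

export class DatasetDO {
  constructor(private ctx: DurableObjectState, private env: Env) {}

  // Pull one named dataset, write a formatted copy to R2, then package
  // rows from the DO's local SQLite storage into D1.
  async processSet(name: string): Promise<void> {
    const raw = await (await fetch(`https://example.com/sets/${name}`)).json();
    await this.env.BUCKET.put(`formatted/${name}.json`, JSON.stringify(raw));

    const rows = this.ctx.storage.sql
      .exec("SELECT key, value FROM items")
      .toArray();
    const stmt = this.env.DB.prepare(
      "INSERT INTO items (key, value) VALUES (?, ?)"
    );
    await this.env.DB.batch(rows.map((r) => stmt.bind(r.key, r.value)));
  }
}
```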
1984 Ford Laser (OP) · 2w ago
If each dataset follows a similar format then maybe a DO per dataset? How big is each dataset?
gruntlord6 · 2w ago
it works fine now, but one of the lists is 189 sets so it takes a while in the current setup; I was considering having a parent process just spin up a DO for each set, then having the parent consolidate
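A hedged sketch of that parent fan-out idea: spin up one DO per set concurrently and consolidate the results (the `DATASETS` binding and `/process` route are assumptions):
```ts
export interface Env {
  DATASETS: DurableObjectNamespace;
}

// Parent side: kick off all sets at once, then consolidate what comes back.
export async function processAll(
  setNames: string[],
  env: Env
): Promise<unknown[]> {
  return Promise.all(
    setNames.map(async (name) => {
      const stub = env.DATASETS.get(env.DATASETS.idFromName(name));
      // Each DO processes its own set; the parent just awaits them all.
      const res = await stub.fetch("https://do/process");
      return res.json();
    })
  );
}
```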
1984 Ford Laser (OP) · 2w ago
yeah, I do similar
gruntlord6 · 2w ago
they vary a lot
1984 Ford Laser (OP) · 2w ago
I have another namespace for each league, which contains a list of IDs that are references to the match DOs
gruntlord6 · 2w ago
I did it for a different data category and it was way smaller, only 90 lists
1984 Ford Laser (OP) · 2w ago
or you could have KV that stores a parent reference to the dataset IDs
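If the KV route were taken, a minimal sketch might look like this (the `INDEX` binding and the key name are assumptions):
```ts
export interface Env {
  INDEX: KVNamespace;
}

// Store the parent's list of dataset IDs once...
export async function saveIndex(env: Env, ids: string[]): Promise<void> {
  await env.INDEX.put("dataset-ids", JSON.stringify(ids));
}

// ...and read it back when orchestrating a run.
export async function loadIndex(env: Env): Promise<string[]> {
  return JSON.parse((await env.INDEX.get("dataset-ids")) ?? "[]");
}
```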
gruntlord6 · 2w ago
there's a separate product I have put off working on where I want a master dataset and each user gets a dedicated database for their data. I was even thinking crazy: each object could be its own DO, since they tend to contain a huge amount of product data and metadata, which would be part of the customer DO, which belongs to the single parent DO that orchestrates the whole rabbit hole. But I figure this would be an easier starting point than that mess. This one I already made was actually large enough that I ran out of memory dumping to R2 and had to reprogram that part
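On the out-of-memory point: one common way around buffering a whole dump is an R2 multipart upload, flushing chunk by chunk instead of holding everything in memory. A sketch under assumed names (`BUCKET` binding, `dump.ndjson` key):
```ts
export interface Env {
  BUCKET: R2Bucket;
}

export async function dumpInParts(
  env: Env,
  chunks: AsyncIterable<Uint8Array>
): Promise<void> {
  const mpu = await env.BUCKET.createMultipartUpload("dump.ndjson");
  const parts: R2UploadedPart[] = [];
  let partNumber = 1;
  try {
    for await (const chunk of chunks) {
      // Note: R2 requires each part except the last to be at least 5 MiB.
      parts.push(await mpu.uploadPart(partNumber++, chunk));
    }
    await mpu.complete(parts);
  } catch (err) {
    await mpu.abort(); // clean up the partial upload on failure
    throw err;
  }
}
```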
1984 Ford Laser (OP) · 2w ago
This is the perfect way to split them up. Like, essentially as expected
gruntlord6 · 2w ago
just seems a bit over the top, but I also read the WebSockets thing and figured that each DO could be updated in real time if users were changing the data, so I can make this pretty complex vs what my original plan was: just 1 DB per user
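The WebSockets idea mentioned here, as a rough sketch using the runtime's hibernation-friendly API (the class name and broadcast behaviour are illustrative):
```ts
export class DataRoom {
  constructor(private ctx: DurableObjectState) {}

  async fetch(request: Request): Promise<Response> {
    if (request.headers.get("Upgrade") !== "websocket") {
      return new Response("expected websocket", { status: 426 });
    }
    const pair = new WebSocketPair();
    // acceptWebSocket() lets the DO hibernate while sockets stay connected.
    this.ctx.acceptWebSocket(pair[1]);
    return new Response(null, { status: 101, webSocket: pair[0] });
  }

  // Broadcast each data change to every other connected user in real time.
  async webSocketMessage(ws: WebSocket, message: string | ArrayBuffer) {
    for (const client of this.ctx.getWebSockets()) {
      if (client !== ws) client.send(message);
    }
  }
}
```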
1984 Ford Laser (OP) · 2w ago
Yeah, you're thinking in DO terms now lol
gruntlord6 · 2w ago
which is still pretty solid and probably fine for what most people would do, but I figure it may be worth the extra work
1984 Ford Laser (OP) · 2w ago
This is all standard procedure
gruntlord6 · 2w ago
idk seems like a lot haha
1984 Ford Laser (OP) · 2w ago
So much flexibility though
gruntlord6 · 2w ago
these Cloudflare guys just try to tempt me with all these fancy use cases
1984 Ford Laser (OP) · 2w ago
man, I don't have enough time to write all the things I want in DOs
gruntlord6 · 2w ago
well, I guess I'll try this crazy concurrency thing tomorrow and see how 189 instances storing this data work out lol
1984 Ford Laser (OP) · 2w ago
I do similar with sets of 80-120 instances at a time. Have like 25 sets like that
gruntlord6 · 2w ago
if that goes well I'll consider trying to make my complicated database plan
1984 Ford Laser (OP) · 2w ago
My main update code through these is super inefficient cos I do a bunch of heavy work in the DO instead of in the calling Worker
gruntlord6 · 2w ago
well, the worst part is I probably have like 10 or so more lists of datasets to architect, and they run daily; these were just the most important ones
1984 Ford Laser (OP) · 2w ago
You could do reads of the written data in the DO, then compare values to what you're updating with, and only update what's necessary, maybe. Obvs that might not work for your use case
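That compare-before-write suggestion could look roughly like this inside a DO, using the key-value storage API (key names and message shape are illustrative):
```ts
export class DatasetDO {
  constructor(private ctx: DurableObjectState) {}

  // Read the stored value first and skip the write when nothing changed,
  // saving a billable storage write.
  async updateIfChanged(key: string, next: string): Promise<boolean> {
    const current = await this.ctx.storage.get<string>(key);
    if (current === next) return false;
    await this.ctx.storage.put(key, next);
    return true;
  }

  async fetch(request: Request): Promise<Response> {
    const { key, value } = await request.json<{ key: string; value: string }>();
    const changed = await this.updateIfChanged(key, value);
    return Response.json({ changed });
  }
}
```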
gruntlord6 · 2w ago
I'm technically reading duplicate data atm, but it's not enough extra duplication for me to really care; I should just be logging the changes on the base set. No, it does work, I just have bigger fish to fry; only so many things I can make and test in a day
1984 Ford Laser (OP) · 2w ago
now you see why peeps love working with 'em, hey. If your use case is monetised correctly, the runtime usage costs are essentially a rounding error, assuming you don't do stupidly high numbers of storage writes
gruntlord6 · 2w ago
the very fact I spent an entire day taking something that worked and making it a DO was already dumb, but I wanted the increased resource efficiency vs a simple cron. Even without the DO I think I still fall under the $5 minimum, based on my napkin math
1984 Ford Laser (OP) · 2w ago
just the thought exercise of working on a DO and getting it is very useful
gruntlord6 · 2w ago
for this anyway
1984 Ford Laser (OP) · 2w ago
It takes a bit to wrap your head around the different architecture paradigm
gruntlord6 · 2w ago
yeah, it was a lot of extra work, especially given the SQL differences and the logic gate thing (input gate, I think). It's a lot more fleshed out though; it just makes me want to make more things when I already have things to make lol
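On the input gate point: DO storage operations close the input gate so other events can't interleave mid-write, and `blockConcurrencyWhile()` gives explicit gating on top of that. A small sketch (the bootstrap key is illustrative):
```ts
export class GatedDO {
  private ready = false;

  constructor(private ctx: DurableObjectState) {
    // No other requests are delivered to this DO until the callback settles.
    ctx.blockConcurrencyWhile(async () => {
      await ctx.storage.get("bootstrap"); // e.g. load initial state
      this.ready = true;
    });
  }

  async fetch(request: Request): Promise<Response> {
    // By the time any fetch runs, initialization has completed.
    return new Response(this.ready ? "ready" : "unreachable");
  }
}
```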
1984 Ford Laser (OP) · 2w ago
Yeah but more efficient stuff hahaha
gruntlord6 · 2w ago
not to write XD
1984 Ford Laser (OP) · 2w ago
just the ease of scalability with this stuff is great, and cos Workers are super extensible you can do a bunch of cool stuff
gruntlord6 · 2w ago
this actually replaced the backend I was running on a VPS, so it was a nice win
1984 Ford Laser (OP) · 2w ago
I have a graphics automation pipeline that can turn a Photoshop file into JSON, with editable values being easily automated, then render it into a PNG, all through Workers, using DOs for easy/efficient handling of data in/out and deferring to Workers when doing larger tasks
gruntlord6 · 2w ago
I was already using Images to convert things to R2 in a front end, and a Worker with D1 for a Pages application to handle download counts; this was initially just a thought exercise to see if I could take a complicated long-running task and run it on a Worker
