How to handle caching with Apollo?
I was looking into this issue. The root cause (RCA) is that the cache for the record itself is updated (optimistically), but the cache for the query that fetches deleted records from the same data source with filter criteria is not updated. Hence the records still show in the deleted list.
The question is, how do I handle this scenario? Do I update the cache for the query with the filter? That query could have many different possible filter values, so updating all of them does not seem ideal. Or do I just mark that query as invalidated? For that, wouldn't I need to know all possible filter combinations again?
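For reference, the two options I'm weighing would look roughly like this in plain Apollo Client (the people field name and the flat list shape are placeholders, not the real schema):

```ts
import { ApolloClient, InMemoryCache } from '@apollo/client';

const client = new ApolloClient({ uri: '/graphql', cache: new InMemoryCache() });

// "people" stands in for whatever root query field serves the filtered list.
function handleRecordDeleted(deletedRecordId: string) {
  // Option A: update every cached variant of the filtered query in place.
  // The modifier is called once per stored filter/sort combination.
  client.cache.modify({
    fields: {
      people(existingRefs: readonly any[] = [], { readField }) {
        return existingRefs.filter(
          (ref) => readField('id', ref) !== deletedRecordId,
        );
      },
    },
  });

  // Option B: invalidate the field wholesale, whatever its variables were,
  // and let active queries refetch from the server.
  client.cache.evict({ id: 'ROOT_QUERY', fieldName: 'people' });
  client.cache.gc();
}
```

Option A keeps the UI optimistic but needs filter-aware logic; option B is simpler but forces a refetch. In practice you would pick one of the two.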
Interesting. I wonder if this is the issue that I'm seeing on my end.
cc @charles
@YodelMonster thanks for taking a look at the issue! You are right: we have two levels of optimistic rendering:
- updating the record in the apollo cache
- updating the results of the queries in the apollo cache (themselves pointing to records)
These mechanisms are automatically handled by useFindManyRecords, useCreateOneRecord, etc... We call the second case "OptimisticEffects"
For example, in useCreateManyRecords, you will see a "triggerCreateRecordsOptimisticEffect"
these optimistic effects basically look at the mounted queries, checking whether the affected records should be added to or removed from the query response based on the query filters and sorts
so I would say that you need to update the logic located in packages/twenty-front/src/modules/apollo/optimistic-effect
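To give a rough idea of the shape of these effects, here is a minimal sketch on top of Apollo's cache.modify (the filter helper, the connection shape, and the argument parsing are simplifications, not the real code):

```ts
import { ApolloCache } from '@apollo/client';

type ObjectRecord = { id: string; deletedAt: string | null };

// Placeholder for the project's filter/sort matching logic.
declare function recordMatchesFilter(
  record: ObjectRecord,
  filter: Record<string, unknown> | undefined,
): boolean;

// Sketch of an optimistic effect: every cached variant of the list field
// (one per mounted filter/sort combination) is visited, and the affected
// record is removed when it no longer matches that variant's filter.
function triggerRecordsOptimisticEffectSketch(
  cache: ApolloCache<object>,
  queryFieldName: string,
  updatedRecord: ObjectRecord,
) {
  cache.modify({
    fields: {
      [queryFieldName](existing: any, { readField, storeFieldName }) {
        // storeFieldName encodes this variant's variables, e.g.
        // 'people({"filter":{...},"orderBy":{...}})'; naive parsing here.
        const argsJson = storeFieldName.match(/\((.*)\)$/)?.[1];
        const filter = argsJson ? JSON.parse(argsJson).filter : undefined;

        const edges: any[] = existing?.edges ?? [];
        if (!recordMatchesFilter(updatedRecord, filter)) {
          return {
            ...existing,
            edges: edges.filter(
              (edge) => readField('id', edge.node) !== updatedRecord.id,
            ),
          };
        }
        // Adding a record that newly matches the filter would also need the
        // sort to pick an insert position; left out of this sketch.
        return existing;
      },
    },
  });
}
```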
@charles I'm not sure how to handle the cache itself. The issue comes from the handler in
triggerDeleteRecordsOptimisticEffect.ts
where the deleted record is removed from the query cache, but that is not the problem. The problem is that this handler does not invalidate or update the cache for the query that fetches the deleted records. I also don't think this can be reliably handled optimistically, since you can have any filter values (apart from the deletedAt != null check).
@YodelMonster taking into account all the possible filters is exactly what these optimistic effects should be doing. This delete case is actually an "update" case, since we are updating deletedAt. So some code inspiration can be found in triggerUpdateRecordOptimisticEffect (you'll see code such as updatedRecordMatchesThisRootQueryFilter, which checks the filters of the query)
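To make that concrete, here is a minimal sketch of the deletedAt part of such a filter check once a soft delete is treated as an update (the filter shape is an assumption, not the real one):

```ts
type ObjectRecord = { id: string; deletedAt: string | null };

// Assumed filter fragment: a query wants only non-deleted records, only
// deleted records, or does not constrain deletedAt at all.
type DeletedAtFilter = { is: 'NULL' } | { is: 'NOT_NULL' };

function matchesDeletedAtFilter(
  record: ObjectRecord,
  filter?: DeletedAtFilter,
): boolean {
  if (!filter) return true;
  return filter.is === 'NULL'
    ? record.deletedAt === null
    : record.deletedAt !== null;
}

// Soft delete modeled as an update: deletedAt flips from null to a date.
const deleted: ObjectRecord = { id: '42', deletedAt: new Date().toISOString() };

matchesDeletedAtFilter(deleted, { is: 'NULL' });     // false -> remove from active lists
matchesDeletedAtFilter(deleted, { is: 'NOT_NULL' }); // true  -> add to the deleted list
```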
My inputs on the high level approach:
- record delete vs record destroy => we actually have useDestroyOneRecord vs useDeleteOneRecord ==> I think we should rename triggerDeleteOptimisticEffect to triggerDestroyOptimisticEffect; this will match the previous behavior from before we introduced soft delete
- record delete should actually trigger triggerUpdateRecordOptimisticEffect (see the rough sketch below)
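Something along these lines, with hypothetical signatures for the existing effects (the real ones live under packages/twenty-front/src/modules/apollo/optimistic-effect):

```ts
import { ApolloCache } from '@apollo/client';

type ObjectRecord = { id: string; deletedAt: string | null };

// Placeholders for the effects discussed above; signatures are assumed.
declare function triggerUpdateRecordOptimisticEffect(
  cache: ApolloCache<object>,
  previousRecord: ObjectRecord,
  updatedRecord: ObjectRecord,
): void;
declare function triggerDestroyRecordsOptimisticEffect(
  cache: ApolloCache<object>,
  destroyedRecordIds: string[],
): void;

// Soft delete: routed through the update effect, so filter/sort matching
// runs and the record can move from "active" queries to "deleted" queries.
function handleDeleteOneRecord(cache: ApolloCache<object>, record: ObjectRecord) {
  triggerUpdateRecordOptimisticEffect(cache, record, {
    ...record,
    deletedAt: new Date().toISOString(),
  });
}

// Destroy (hard delete): the record disappears from every mounted query.
function handleDestroyOneRecord(cache: ApolloCache<object>, record: ObjectRecord) {
  triggerDestroyRecordsOptimisticEffect(cache, [record.id]);
}
```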