Expected latency of cache accesses

Hi! I've been messing around with Workers for a bit for a research project and have been using the Cache API. All the worker does is a put or a match against the cache, depending on the pathname of the URL, using these two functions (a rough sketch of the surrounding fetch handler follows them):
async function putter(key, value) {
  // Time a cache.put(); the response is cloned so the original can still be returned.
  const start = performance.now();
  await testCache.put(key, value.clone());
  const end = performance.now();
  const time = end - start;
  return [value, time];
}

async function getter(key) {
  // Time a cache.match() for the same key.
  const start = performance.now();
  const cacheValue = await testCache.match(key);
  const end = performance.now();
  const time = end - start;
  return [cacheValue, time];
}
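For context, the helpers are called from a fetch handler roughly like the sketch below. The cache name, key, and route paths are just placeholders I've picked for illustration, not anything I think is special:

// Simplified sketch of the surrounding worker; testCache is the named cache
// the helpers above refer to, opened lazily on the first request.
let testCache;

export default {
  async fetch(request) {
    testCache ??= await caches.open("test-cache");

    const url = new URL(request.url);
    const key = new Request(`${url.origin}/cache-key`);

    if (url.pathname === "/put") {
      // Store a small response with an explicit max-age so the entry is kept.
      const value = new Response("hello", {
        headers: { "Cache-Control": "max-age=3600" },
      });
      const [, time] = await putter(key, value);
      return new Response(`put took ${time} ms`);
    }

    if (url.pathname === "/match") {
      const [cacheValue, time] = await getter(key);
      return new Response(`match took ${time} ms, hit: ${cacheValue !== undefined}`);
    }

    return new Response("not found", { status: 404 });
  },
};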
The worker is deployed under a custom domain. Functionally this works exactly how I'd expect: values are stored in the cache and can be retrieved. However, the performance surprised me; the latency reported by the above functions averages ~5 ms for a match request. Is this expected? If not, are there any pointers for improving performance? Perhaps I've configured the cache headers incorrectly?
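In case it matters, the response I pass to putter is built roughly like this; the body, content type, and max-age are placeholder values rather than my exact setup:

// Sketch of how the cached value is constructed before calling putter().
// cache.put() respects the Cache-Control header on the stored response when
// deciding how long to keep it, so this is where a header mistake would live.
const value = new Response(JSON.stringify({ hello: "world" }), {
  headers: {
    "Content-Type": "application/json",
    "Cache-Control": "public, max-age=3600",
  },
});
const [stored, putTime] = await putter(key, value); // key built as in the handler sketch above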