Unable to load tfjs ML model

Hello guys, I'm having a bit of a problem getting tensorflow.js working on Cloudflare Workers. When I call tf.loadLayersModel() it opens 45 concurrent connections to read all my binary files, which are currently stored on R2. This causes "Error: Response closed due to connection limit".

I've tried using the R2 env binding, but the function requires a URL to be passed into it so it can load the main file (model.json) and the dozens of other binary files in the same folder. I've attempted to read them one by one, but at that point the worker timed out barely getting past 35. I've also attempted to store them in KV, but they are binary files and it seems I wasn't able to insert them into KV as-is. I was thinking of converting the binaries into base64 strings so I could store them, but that just adds another layer of processing which I have no CPU time left for.

TL;DR: I'm looking for the concurrent connection limit for R2 when reading multiple files in the same folder at the same time. I'm wondering if it's possible to increase that limit somehow, or to cap the loads at a couple of concurrent requests at a time, but I have no idea what the limit actually is. I'm sure combining the files into one would fix the issue, but I'm scared I'd spend hours on it and get no return after hitting another wall. Is there some other Worker function I could use? This is the test code I'm currently using:
import * as tf from '@tensorflow/tfjs';

async function loadModel() {
  // model.json references dozens of binary weight shards in the same folder;
  // tfjs fetches them all concurrently.
  const model = await tf.loadLayersModel('https://custom(r2).domain/models/tfjs_model/model.json');
  return model;
}

addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event));
});

async function handleRequest(event) {
  try {
    const model = await loadModel();
    console.log('Model loaded successfully');
    return new Response('Model loaded successfully', { status: 200 });
  } catch (err) {
    console.error('An error occurred:', err);
    return new Response('An error occurred: ' + err, { status: 500 });
  }
}
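One way to cap the fan-out that the post asks about: tfjs's load options include a `fetchFunc` hook, so you can hand `tf.loadLayersModel()` a fetch wrapper that only lets a few requests run at once. The `createLimiter` helper below is my own sketch (the name and the limit of 4 are assumptions, not a tfjs or Workers API); whether 4 fits inside the Worker's connection and CPU limits would need testing.

```javascript
// Hypothetical helper: run at most `max` of the queued async functions
// concurrently; the rest wait in a FIFO queue until a slot frees up.
function createLimiter(max) {
  let active = 0;
  const queue = [];
  function next() {
    if (active >= max || queue.length === 0) return;
    active += 1;
    const { fn, resolve, reject } = queue.shift();
    fn()
      .then(resolve, reject)
      .finally(() => {
        active -= 1;
        next(); // a slot freed up, start the next queued task
      });
  }
  return (fn) =>
    new Promise((resolve, reject) => {
      queue.push({ fn, resolve, reject });
      next();
    });
}

// Sketch of wiring it into tf.loadLayersModel via the fetchFunc load option,
// assuming the same model URL as above:
//
//   const limit = createLimiter(4); // 4 concurrent shard fetches, my guess
//   const model = await tf.loadLayersModel(
//     'https://custom(r2).domain/models/tfjs_model/model.json',
//     { fetchFunc: (url, init) => limit(() => fetch(url, init)) }
//   );
```

This trades the 45-way burst for a longer wall-clock load, so it may just move the failure from the connection limit to the timeout; it's a way to probe where the limit sits rather than a guaranteed fix.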
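On the R2-binding point above (that the env binding can't be passed a URL): one workaround is to have the Worker itself serve the model files from the bucket, so tfjs still fetches plain URLs but those URLs hit the Worker's own origin instead of a separate public domain. This is a sketch using module-syntax Workers and an assumed R2 binding named `MODELS` in wrangler.toml; `r2KeyFromPath` is my own helper name.

```javascript
// Map a request path like "/models/tfjs_model/model.json" to an R2 object key.
function r2KeyFromPath(pathname) {
  return pathname.replace(/^\/+/, '');
}

export default {
  async fetch(request, env) {
    const key = r2KeyFromPath(new URL(request.url).pathname);
    const object = await env.MODELS.get(key); // read via the R2 binding
    if (object === null) {
      return new Response('Not found', { status: 404 });
    }
    return new Response(object.body);
  },
};
```

Note this doesn't by itself reduce the 45 concurrent reads; it only routes them through the binding instead of a public bucket URL.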