Cloudflare Developers
Created by sanser on 3/31/2024 in #pages-help
postgres in pages
also, those libs don't work with CF Workers since there's no fs access. there are other ways to connect to Postgres, like Neon
7 replies
Cloudflare Developers
Created by sanser on 3/31/2024 in #pages-help
postgres in pages
you'll need to connect at the request level, not the app level; serverless Pages functions can't maintain a persistent DB connection between requests
7 replies
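The request-level pattern described above can be sketched as below. The driver here is a hypothetical stub just to show the shape; in a real Worker you would use an HTTP-based driver such as Neon's serverless driver, but the point is the same: the client is created inside the handler, used once, and never shared across invocations.

```typescript
// Hypothetical minimal driver, standing in for a real serverless Postgres
// client. The names SqlClient/createClient/handleRequest are illustrative.
interface SqlClient {
  query(sql: string): string[];
}

let connectionsOpened = 0;

function createClient(): SqlClient {
  connectionsOpened++; // each request pays the (cheap, HTTP-based) connection cost
  return { query: (sql: string) => [`result of: ${sql}`] };
}

// Request-level connection: created per invocation, never held at module scope.
function handleRequest(path: string): string[] {
  const sql = createClient();
  return sql.query(`SELECT * FROM pages WHERE path = '${path}'`);
}
```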
Cloudflare Developers
Created by rubberburger on 10/2/2024 in #pages-help
Easiest way of persisting function logs
i push logs to a separate service; I've found Axiom to be cheap and easy to integrate
3 replies
Cloudflare Developers
Created by Murder Chicken on 9/24/2024 in #workers-help
Serialized RPC arguments or return values are limited to 1MiB
for added efficiency, I cache the response in memory on the callee side so I don't have to reprocess it. my RPC callee is a Durable Object, which keeps its in-memory state alive for a few seconds between requests, and that's enough for my use case
17 replies
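The callee-side cache described above can be sketched as a plain map with a timestamp: a Durable Object keeps instance fields in memory while it stays warm, so nothing fancier is needed for short-lived reuse. The `ResponseCache` name and TTL parameter are illustrative, not from the original code.

```typescript
// Hypothetical sketch: an in-memory cache held as a field on a Durable
// Object. Entries survive between requests only while the DO stays warm.
class ResponseCache<T> {
  private entries = new Map<string, { value: T; storedAt: number }>();

  constructor(private ttlMs: number) {}

  set(key: string, value: T): void {
    this.entries.set(key, { value, storedAt: Date.now() });
  }

  get(key: string): T | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (Date.now() - entry.storedAt > this.ttlMs) {
      this.entries.delete(key); // expired: evict and let the caller recompute
      return undefined;
    }
    return entry.value;
  }
}
```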
Cloudflare Developers
Created by Murder Chicken on 9/24/2024 in #workers-help
Serialized RPC arguments or return values are limited to 1MiB
which in my case is called on the RPC callee side in the response handler:

```typescript
const response = stream
  ? createTextStream(finalResults, this.conf.RPC_STREAM_CHUNK_SIZE, this.logger)
  : finalResults;

return response;
```
17 replies
Cloudflare Developers
Created by Murder Chicken on 9/24/2024 in #workers-help
Serialized RPC arguments or return values are limited to 1MiB
on the callee side, I wrap my response in something like this if I want to stream it:
```typescript
export function createTextStream(
  results: any,
  chunkSize: number = 1000 * 1000, // 1 MB default, just under the 1 MiB limit
  logger: any
): ReadableStream<Uint8Array> {
  const text = JSON.stringify(results);
  const encoder = new TextEncoder();

  return new ReadableStream({
    start(controller) {
      let offset = 0;
      while (offset < text.length) {
        const chunk = text.slice(offset, offset + chunkSize);
        controller.enqueue(encoder.encode(chunk)); // encode the text chunk
        offset += chunkSize;
        // logger.debug('enqueued', { offset });
      }
      // logger.debug('closing stream');
      controller.close();
      // logger.debug('closed stream');
    }
  });
}
```
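The slice-and-encode loop above is lossless, which can be shown without streams at all. This standalone sketch (function names are illustrative) chunks a JSON string the same way and reassembles it on the other side:

```typescript
// Chunk a JSON string exactly as the stream above does: slice by character
// offset, then encode each slice to bytes.
function chunkText(text: string, chunkSize: number): Uint8Array[] {
  const encoder = new TextEncoder();
  const chunks: Uint8Array[] = [];
  let offset = 0;
  while (offset < text.length) {
    chunks.push(encoder.encode(text.slice(offset, offset + chunkSize)));
    offset += chunkSize;
  }
  return chunks;
}

// Reassemble on the receiving side. { stream: true } lets the decoder hold
// partial multi-byte sequences across chunk boundaries instead of emitting
// replacement characters.
function reassemble(chunks: Uint8Array[]): string {
  const decoder = new TextDecoder();
  let text = "";
  for (const chunk of chunks) text += decoder.decode(chunk, { stream: true });
  return text + decoder.decode(); // flush any trailing bytes
}
```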
17 replies
Cloudflare Developers
Created by Murder Chicken on 9/24/2024 in #workers-help
Serialized RPC arguments or return values are limited to 1MiB
My code isn't suited to post here in full, but this is the key part that handles stream processing on the caller side. Note that the closure I pass in is the actual RPC call.
```typescript
export async function withStreamProcessing(
  operation: (streaming: boolean) => Promise<unknown | ReadableStream<Uint8Array>>,
  logger: any,
  identifier: Record<string, any> = {},
  forceStreaming?: boolean
): Promise<Response> {
  try {
    // If forceStreaming is true, skip the non-streaming attempt
    if (!forceStreaming) {
      // First attempt without streaming
      const result = await operation(false);
      return new Response(JSON.stringify(result), {
        headers: { 'Content-Type': 'application/json' }
      });
    }

    // Direct streaming or fallback after error
    const streamResult = await operation(true);
    if (!(streamResult instanceof ReadableStream)) {
      return new Response(JSON.stringify(streamResult));
    }

    // Return the stream directly
    return new Response(streamResult, {
      headers: { 'Content-Type': 'application/json' }
    });

  } catch (error) {
    logger.debug('Operation error', { error, forceStreaming, ...identifier });

    // Check if it's the RPC serialization error and we haven't tried streaming yet
    if (!forceStreaming &&
        error instanceof Error &&
        error.message.includes('Serialized RPC arguments or return values are limited')) {
      logger.warn('Retrying with streaming due to size limitation', { ...identifier });
      // Recursively call with forceStreaming
      return withStreamProcessing(operation, logger, identifier, true);
    }

    logger.error('Operation failed:', { error, forceStreaming, ...identifier });
    throw error;
  }
}
```
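The core of the fallback above is: try plain once, and retry exactly once in streaming mode only when the serialization-limit error fires. Stripped of the async and Response plumbing, that control flow can be sketched synchronously (`callWithFallback` and the fake operation are illustrative names; the error message matches the real Workers limit error):

```typescript
// Synchronous sketch of the try-plain-then-stream fallback.
const SIZE_LIMIT_MSG = "Serialized RPC arguments or return values are limited";

function callWithFallback(
  operation: (streaming: boolean) => string,
  forceStreaming = false
): string {
  try {
    if (!forceStreaming) return operation(false); // plain attempt first
    return operation(true); // streaming path
  } catch (error) {
    // Retry exactly once, and only for the serialization-limit error.
    if (
      !forceStreaming &&
      error instanceof Error &&
      error.message.includes(SIZE_LIMIT_MSG)
    ) {
      return callWithFallback(operation, true);
    }
    throw error; // anything else propagates unchanged
  }
}
```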
17 replies