Nuxt · 2w ago
Eric

read stream from event twice

I'm trying to find a way to read a stream (from an h3 event) twice: once to check its size and once to actually use it. I tried this, but .values doesn't exist on req:
const MAX_SIZE = 10 * 1024 * 1024; // 10MB limit
let totalSize = 0;

for await (const chunk of event.node.req.values({ preventCancel: true })) {
  totalSize += chunk.length;

  // If total size exceeds limit, throw an error
  if (totalSize > MAX_SIZE) {
    console.error('Request too large');
    return {
      ok: false,
      status: 413,
      statusText: 'Request too large',
    };
  }
}
I tried creating a ReadableStream, but then I can't find how to actually read it. I tried to create a new h3 event with createEvent, but I can't figure out how to do it properly. So: is there a way to read the stream twice? Is there a way to read a ReadableStream? What should I do? I want to measure the incoming stream to prevent gigantic files from getting uploaded. From what I've found, just checking the .size value might not be completely safe. By reading it chunk by chunk, we can stop the stream if the limit is reached, and the server won't have to store 100 GB in case of an attack. Does that make sense? Is it stupid?
9 Replies
kapa.ai · 2w ago
Hi @Eric. I'm kapa.ai, your friendly support bot. I will try to answer your question by searching through the knowledge base.

Your question is about handling streams in the context of Nuxt and H3. Your approach of reading the stream once to check its size and once to use it is on the right track, but there are some issues with the implementation. Let's break down the problem:

1. Reading the stream twice: unfortunately, streams are typically consumed only once. Once you've read from a stream, you can't rewind it, which is why your attempt to read it twice doesn't work as expected.
2. Checking the size: your reasoning for wanting to check the size chunk by chunk is correct. It's good practice to prevent potential attacks or unintended large file uploads.
3. Implementation: instead of trying to read the stream twice, read it once, chunk by chunk (event.node.req is an async iterable), keep a running total of the size, and collect the chunks for the actual processing. Here's how you can implement this:
import { defineEventHandler, createError } from 'h3'

export default defineEventHandler(async (event) => {
  const MAX_SIZE = 10 * 1024 * 1024 // 10MB limit
  let totalSize = 0
  const chunks: Buffer[] = []

  // event.node.req is an async iterable of Buffer chunks
  for await (const chunk of event.node.req) {
    totalSize += chunk.length
    if (totalSize > MAX_SIZE) {
      throw createError({
        statusCode: 413,
        statusMessage: 'Request too large'
      })
    }
    chunks.push(chunk)
  }

  // If we've reached here, the body size is within limits
  // You can now use `body` (the full request payload) for further processing
  const body = Buffer.concat(chunks)

  return { ok: true, size: body.length }
})
This reads the body chunk by chunk as it arrives, so you can check the size while the stream is being consumed and stop as soon as the limit is reached, without first loading the entire payload into memory. That is particularly useful for preventing potential attacks or unintended large file uploads, as you mentioned. (The built-in body helpers such as readBody and readRawBody are documented in the H3 Body utils section, but they read the whole body in one go, so they don't let you enforce a limit mid-stream.)

To answer your specific questions:
1. Is there a way to read the stream twice? Generally, no. Streams are typically consumed once; process the stream as you read it, as shown in the example above.
2. Is there a way to read a ReadableStream? Yes: get a reader (or iterate it) and consume it chunk by chunk, the same way the example consumes event.node.req.
3. What should you do? Read the body once, checking the size as you go, and keep the chunks for the actual processing.
4. Does your approach make sense? Is it stupid? It's not stupid at all; it's good practice for handling potentially large uploads securely. Checking the size as you read is more efficient and safer than reading the entire stream into memory before checking its size.

Remember to handle errors appropriately and send proper responses to the client when limits are exceeded or other errors occur.

H3 Body utils
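One refinement to the first point: a web ReadableStream can be split into two independent branches with tee(). A minimal sketch, assuming h3's getRequestWebStream utility is available (recent h3 v1 releases expose it); note that the unread branch buffers whatever the other branch has already consumed, so this doesn't reduce memory use by itself:

import { defineEventHandler, getRequestWebStream, createError } from 'h3'

export default defineEventHandler(async (event) => {
  const MAX_SIZE = 10 * 1024 * 1024
  const stream = getRequestWebStream(event)
  if (!stream) return { ok: false, status: 400, statusText: 'No request body' }

  // tee() returns two ReadableStreams fed by the same incoming bytes
  const [measured, usable] = stream.tee()

  let totalSize = 0
  for await (const chunk of measured) {
    totalSize += chunk.length
    if (totalSize > MAX_SIZE) {
      throw createError({ statusCode: 413, statusMessage: 'Request too large' })
    }
  }

  // `usable` is still an unconsumed ReadableStream here (its chunks were buffered),
  // so it can be forwarded or parsed by anything that expects a stream
  return { ok: true, size: totalSize }
})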
Omnislash · 2w ago
Hello! What do you mean by "use it"? Can you just start stacking it into a Buffer and stop if it becomes too large? Sometimes the source includes a Content-Length; did you check that, just in case?
Eric (OP) · 2w ago
Hi! Thanks for your help! Yes, I tried a loop that reads the stream chunk by chunk, stores everything in a variable, adds up the total size, and stops if it reaches the max size. But I couldn't figure out what to do with it after.
I was thinking of adding a check on Content-Length too, but it's not perfectly safe because it is set client side; a malicious user could tamper with it.
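A cheap first line of defence that fits alongside that: reject the request up front when the declared Content-Length already exceeds the limit, and keep the chunk-by-chunk check as the real enforcement. A small sketch using h3's getRequestHeader; the surrounding handler and MAX_SIZE are assumed from the snippet below:

import { getRequestHeader } from 'h3'; // auto-imported in Nuxt server routes anyway

// Best-effort guard: Content-Length is client-controlled, so this only rejects
// honestly oversized requests early; the streaming check stays the hard limit.
const declaredSize = Number(getRequestHeader(event, 'content-length') ?? 0);
if (declaredSize > MAX_SIZE) {
  return {
    ok: false,
    status: 413,
    statusText: 'Request too large',
  };
}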
const MAX_SIZE = 10 * 1024 * 1024; // 10MB limit
let totalSize = 0;
let chunks: Buffer[] = [];

const reader = event.node.req;

for await (const chunk of reader) {
  totalSize += chunk.length;

  if (totalSize > MAX_SIZE) {
    console.error('Request too large');
    return {
      ok: false,
      status: 413,
      statusText: 'Request too large',
    };
  }

  chunks.push(chunk); // Store chunk in memory for later processing
}
This reads, measures, and stores the data. But what do I do with `chunks` after?
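If the buffered body turns out to be multipart form data, one way to read it back (in Node 18+, where the global Response implements formData() for multipart bodies) is to hand the buffer and the original content-type header, which carries the multipart boundary, to a Response and let it parse; a sketch:

// Rebuild a parseable body from the buffered chunks.
const buffer = Buffer.concat(chunks);

// Let the built-in fetch machinery parse the multipart body.
const form = await new Response(buffer, {
  headers: { 'content-type': event.node.req.headers['content-type'] ?? '' },
}).formData();

for (const [name, value] of form.entries()) {
  // Plain fields arrive as strings, file fields as File/Blob objects.
  console.log(name, value instanceof Blob ? `${value.size} bytes` : value);
}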
Omnislash · 2w ago
Well, it depends: what is the blob/file about? What do you want to do with it? For example, if it's an image you want to store on S3, you could:
- use something like putStream to send it chunk by chunk (and cancel it if it gets too large or something), or
- concat the chunks and upload it in one go with a basic put (a quick sketch of this option is below).
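A sketch of the second option; the URL and headers are placeholders, not a real S3 or Directus call. True chunk-by-chunk streaming would instead need a client that accepts a stream body, such as an SDK upload helper or fetch with a web ReadableStream.

// Option 2: the chunks were already size-checked while buffering, so just
// concatenate them and send everything in a single request.
const buffer = Buffer.concat(chunks);

const res = await fetch('https://example.com/upload', { // placeholder endpoint
  method: 'PUT',
  headers: { 'content-type': 'application/octet-stream' },
  body: buffer,
});
if (!res.ok) {
  throw new Error(`Upload failed with status ${res.status}`);
}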
Eric (OP) · 2w ago
The incoming data is a FormData that needs to be sent to a Directus server. It contains data like id, userId, etc., and an image. I need to parse it to make two different requests to Directus. Before trying to add the size check, I read it to obtain an array of values and filtered it to create the two requests. Once chunks is concatenated, it is still not readable, right? Do you know how I can read it to turn it into an array of usable data objects?
Omnislash · 2w ago
Sorry, I use Adonis as a backend; I do use the chunks for the file part, but it handles the data for me on the parameters part ^^"
Mittchel · 2w ago
To handle the chunks after reading them, you need to concatenate them into a single Buffer and then parse the data. Since the incoming data is formData, you can use a library like formidable or busboy to parse the multipart form data. Here's how you can do it:
import { IncomingForm } from 'formidable';
import { Readable } from 'node:stream';
import { readFileSync } from 'node:fs';

const MAX_SIZE = 10 * 1024 * 1024; // 10MB limit
let totalSize = 0;
const chunks: Buffer[] = [];

const reader = event.node.req;

for await (const chunk of reader) {
  totalSize += chunk.length;

  if (totalSize > MAX_SIZE) {
    console.error('Request too large');
    return {
      ok: false,
      status: 413,
      statusText: 'Request too large',
    };
  }

  chunks.push(chunk); // Store chunk in memory for later processing
}

// Concatenate all chunks into a single Buffer
const buffer = Buffer.concat(chunks);

// Parse the form data. formidable expects a request-like stream (it reads the
// headers for the multipart boundary), so wrap the buffer in a Readable and
// reattach the original request headers instead of passing the buffer directly.
const form = new IncomingForm();
const bufferedReq = Object.assign(Readable.from(buffer), {
  headers: event.node.req.headers,
});

form.parse(bufferedReq as any, (err, fields, files) => {
  if (err) {
    console.error('Error parsing form data:', err);
    return {
      ok: false,
      status: 500,
      statusText: 'Internal Server Error',
    };
  }
  // Now you have the fields and files
  // (depending on the formidable version, fields/files values may be arrays)
  const { id, userId } = fields;
  const imageFile = files.image; // Assuming the file input field is named 'image'

  // Make your requests to Directus here
  // Example: send fields to Directus
  fetch('https://your-directus-server/items/your-collection', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ id, userId }),
  })
    .then(response => response.json())
    .then(data => {
      console.log('Success:', data);
    })
    .catch((error) => {
      console.error('Error:', error);
    });

  // Example: upload image to Directus.
  // formidable writes the upload to a temp file, so read it back before appending.
  const formData = new FormData();
  formData.append('file', new Blob([readFileSync(imageFile.filepath)]), imageFile.originalFilename);

  fetch('https://your-directus-server/files', {
    method: 'POST',
    body: formData,
  })
    .then(response => response.json())
    .then(data => {
      console.log('Success:', data);
    })
    .catch((error) => {
      console.error('Error:', error);
    });
});
With busboy:
import Busboy from 'busboy';

const MAX_SIZE = 10 * 1024 * 1024; // 10MB limit
let totalSize = 0;
const chunks: Buffer[] = [];

const reader = event.node.req;

for await (const chunk of reader) {
  totalSize += chunk.length;

  if (totalSize > MAX_SIZE) {
    console.error('Request too large');
    return {
      ok: false,
      status: 413,
      statusText: 'Request too large',
    };
  }

  chunks.push(chunk); // Store chunk in memory for later processing
}

// Concatenate all chunks into a single Buffer
const buffer = Buffer.concat(chunks);

// Create a Busboy instance to parse the form data
const busboy = Busboy({ headers: event.node.req.headers });

const fields: Record<string, string> = {};
const files: Record<string, { filename: string; data: Buffer }> = {};

// Handle field events (non-file fields)
busboy.on('field', (fieldname, value) => {
  fields[fieldname] = value;
});

// Handle file events (busboy v1 passes an info object containing the filename)
busboy.on('file', (fieldname, file, info) => {
  const fileChunks: Buffer[] = [];
  file.on('data', (chunk) => {
    fileChunks.push(chunk);
  });
  file.on('end', () => {
    files[fieldname] = {
      filename: info.filename,
      data: Buffer.concat(fileChunks),
    };
  });
});
// Handle the end of the form (busboy v1 emits 'close' once parsing is done)
busboy.on('close', () => {
  // Now you have the fields and files
  const { id, userId } = fields;
  const imageFile = files.image; // Assuming the file input field is named 'image'

  // Make your requests to Directus here
  // Example: send fields to Directus
  fetch('https://your-directus-server/items/your-collection', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ id, userId }),
  })
    .then(response => response.json())
    .then(data => {
      console.log('Success:', data);
    })
    .catch((error) => {
      console.error('Error:', error);
    });

  // Example: upload image to Directus
  const formData = new FormData();
  formData.append('file', new Blob([imageFile.data]), imageFile.filename);

  fetch('https://your-directus-server/files', {
    method: 'POST',
    body: formData,
  })
    .then(response => response.json())
    .then(data => {
      console.log('Success:', data);
    })
    .catch((error) => {
      console.error('Error:', error);
    });
});

// Write the buffer to busboy for parsing
busboy.end(buffer);
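A variation on the busboy approach worth noting: busboy can also consume the request stream directly and enforce the size cap itself through its limits option, so nothing has to be buffered up front. A rough sketch assuming busboy v1 (field handling and the response are omitted):

import Busboy from 'busboy';

// Let busboy enforce a per-file size cap while parsing the live request stream.
const bb = Busboy({
  headers: event.node.req.headers,
  limits: { fileSize: 10 * 1024 * 1024 }, // 10MB per file
});

bb.on('file', (name, file, info) => {
  const fileChunks: Buffer[] = [];
  file.on('data', (chunk) => fileChunks.push(chunk));
  // 'limit' fires once fileSize is exceeded; busboy discards the rest of that file.
  file.on('limit', () => console.error(`File "${info.filename}" exceeded the size limit`));
  file.on('end', () => {
    // Buffer.concat(fileChunks) is the (possibly truncated) file content.
  });
});

bb.on('close', () => {
  // Parsing finished; respond or forward to Directus here.
});

// Pipe the incoming request straight into busboy instead of buffering it first.
event.node.req.pipe(bb);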
Multipart Form Data with Nitro and h3
import { defineEventHandler, readMultipartFormData } from 'h3';

export default defineEventHandler(async (event) => {
  const MAX_SIZE = 10 * 1024 * 1024; // 10MB limit
  let totalSize = 0;

  // Read and parse the multipart form data
  const parts = await readMultipartFormData(event);

  if (!parts) {
    return {
      ok: false,
      status: 400,
      statusText: 'No form data found',
    };
  }

  const fields: Record<string, string> = {};
  const files: Record<string, { filename: string; data: Buffer }> = {};

  for (const part of parts) {
    // Check if the part is a field (non-file)
    if (!part.filename) {
      fields[part.name] = part.data.toString('utf-8'); // Convert buffer to string
    } else {
      // Handle file parts
      totalSize += part.data.length;

      if (totalSize > MAX_SIZE) {
        return {
          ok: false,
          status: 413,
          statusText: 'Request too large',
        };
      }

      files[part.name] = {
        filename: part.filename,
        data: part.data,
      };
    }
  }

  // Now you have the fields and files
  const { id, userId } = fields;
  const imageFile = files.image; // Assuming the file input field is named 'image'
  // Example: Send fields to Directus
  const directusFieldsResponse = await $fetch('https://your-directus-server/items/your-collection', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({ id, userId }),
  });

  console.log('Directus fields response:', directusFieldsResponse);

  // Example: Upload image to Directus
  const formData = new FormData();
  formData.append('file', new Blob([imageFile.data]), imageFile.filename);

  const directusFileResponse = await $fetch('https://your-directus-server/files', {
    method: 'POST',
    body: formData,
  });

  console.log('Directus file response:', directusFileResponse);

  return {
    ok: true,
    status: 200,
    statusText: 'Form data processed successfully',
    fields,
    files: Object.keys(files), // Return the file field names for reference
  };
});
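One note that applies to most snippets in this thread: returning an object like { ok: false, status: 413 } from an h3 event handler still produces an HTTP 200 response with that object as the JSON body. To make the client actually receive a 413, throw an error (or set the status explicitly), for example:

import { createError } from 'h3'; // auto-imported in Nuxt server routes

if (totalSize > MAX_SIZE) {
  throw createError({ statusCode: 413, statusMessage: 'Request too large' });
}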
Eric (OP) · 2w ago
Thanks guys for your help. I had to move to another project, so I couldn't set this up, but I'll come back to it in two weeks. Thanks again!!
