Streaming file from Client to Server to S3

I am working on a side project in which I need to upload data to an S3 bucket. The user selects a large file on the React frontend and uploads it. The file is sent to an Express.js server, which should pipe/stream the chunks to an S3 bucket as soon as they arrive. Is the formData that I send from the client to the server a readable stream?
const handleFileUpload = async (event) => {
  const formData = new FormData(event.currentTarget)
  event.currentTarget.reset()
  const { data } = await fetch(`${import.meta.env.VITE_SERVER_URL}/upload`, {
    method: 'POST',
    credentials: 'include',
    body: formData,
  }).then((res) => res.json())
}
...
<form onSubmit={handleFileUpload}>
  <input type="file" name="file" required />
  <button type="submit" disabled={loading}>
    Upload
  </button>
</form>
9 Replies
JulieCezar · 7mo ago
The answer is yes. However, a tip: this is not a good approach, because you pay for the upload bandwidth twice and add an unnecessary hop:
1. the upload to the server where your Express app runs
2. the upload from that server to S3
The better approach is to have Express create a presignedPostURL and upload directly from the client with it. If you want to make your life simpler, use UploadThing; otherwise just use Uppy to upload with the presigned URL.
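For reference, a minimal sketch of the presigned-POST route described above, assuming an existing S3Client named `s3`, the `bucketName` and `checkAuthentication` from the server snippets later in the thread, and an Express `app`; the `/presign` path and the size limit are illustrative, not from the thread:

import crypto from 'node:crypto'
import { createPresignedPost } from '@aws-sdk/s3-presigned-post'

app.post('/presign', checkAuthentication, async (req, res) => {
  const key = crypto.randomUUID()
  const { url, fields } = await createPresignedPost(s3, {
    Bucket: bucketName,
    Key: key,
    Conditions: [['content-length-range', 0, 100 * 1024 * 1024]], // reject files over ~100 MB
    Expires: 600, // the presigned POST stays valid for 10 minutes
  })
  // The client then POSTs a multipart form containing `fields` plus the file to `url`.
  res.json({ url, fields, key })
})

The client (or a library like Uppy) builds a FormData with those `fields` and the file and POSTs it straight to S3, so the Express server never touches the file bytes.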
harshcut · 7mo ago
A few questions:
1. When uploading a large file from the client to S3, what should the expiry be for the presignedPostURL?
2. I also want to save the file metadata in a MongoDB database. Do I make another POST request to Express to save the data to MongoDB after the upload is complete? Or could I set upload_state to pending on the initial request that fetches the presignedPostURL?
I am creating this app as an exercise to learn and understand streams and piping data, and probably won't use the presigned URL logic.
The answer is yes.
@JulieCezar I don't think the file is streamable when sending it using formData. Below is the code I'm using on my Express server and it throws an error.
import { Upload } from '@aws-sdk/lib-storage'
const upload = multer()
...
app.post('/upload', checkAuthentication, upload.single('file'), uploadStream)
...
export const uploadStream = async (req, res) => {
  const fileId = crypto.randomUUID()
  const parallelUploads3 = new Upload({
    client: s3,
    params: {
      Bucket: bucketName,
      Key: fileId,
      Body: file.stream
      // ?^ stream does not exist on file object
    },
    queueSize: 1,
  })
}
if (typeof data.stream === "function") {
               ^
TypeError: Cannot read properties of undefined (reading 'stream')
kminator · 7mo ago
I don’t think the OP needs to change to use a pre-signed URL. It should be fine to just have the server do the upload to S3; it’s simpler. There are pros and cons to both approaches. Where did you define the “data” variable in that code? The error says “data” is undefined.
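If the upload does stay on the server, one way to actually stream the file (rather than buffering it whole with multer's memory storage) is to parse the multipart body with busboy and hand the file stream to lib-storage's Upload. This is only a hedged sketch, not what anyone in the thread used; busboy is a swapped-in parser, and `s3`, `bucketName`, and `checkAuthentication` are assumed from the snippets above:

import busboy from 'busboy'
import crypto from 'node:crypto'
import { Upload } from '@aws-sdk/lib-storage'

app.post('/upload', checkAuthentication, (req, res) => {
  const bb = busboy({ headers: req.headers })
  bb.on('file', async (_name, fileStream, _info) => {
    try {
      const fileId = crypto.randomUUID()
      const parallelUploads3 = new Upload({
        client: s3,
        params: { Bucket: bucketName, Key: fileId, Body: fileStream },
        queueSize: 1, // upload one part at a time, in arrival order
      })
      await parallelUploads3.done()
      res.json({ data: { fileId } })
    } catch (err) {
      res.sendStatus(500)
    }
  })
  // Pipe the incoming request into busboy so parts are parsed as they arrive.
  req.pipe(bb)
})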
harshcut · 7mo ago
data is an internal variable in a library.
server:dev: /home/harsh/Documents/GitHub/s3-upload/node_modules/@aws-sdk/lib-storage/dist-cjs/index.js:160
server:dev:   if (typeof data.stream === "function") {
server:dev:                  ^
server:dev: TypeError: Cannot read properties of undefined (reading 'stream')
server:dev:     at getChunk (/home/harsh/Documents/GitHub/s3-upload/node_modules/@aws-sdk/lib-storage/dist-cjs/index.js:160:19)
server:dev:     at _Upload.__doMultipartUpload (/home/harsh/Documents/GitHub/s3-upload/node_modules/@aws-sdk/lib-storage/dist-cjs/index.js:374:24)
harshcut · 7mo ago
Also, this is the console log of req.file:
{
  fieldname: 'file',
  originalname: 'FILE_NAME.pdf',
  encoding: '7bit',
  mimetype: 'application/pdf',
  buffer: <Buffer >,
  size: 0
}
JulieCezar · 7mo ago
1. Depends on you mostly... I put something like 10 min to 1 h depending on what I expect the file size to be, just in case.
2. Depends...
2.1 If you are using UploadThing, you can control that in the onUploadComplete callback.
2.2 If you are using your server to upload, then after it's done you can just call Mongo.
2.3 If you are doing all of that on the client, you need to wait for the upload to finish, then do a POST to some endpoint on your server to insert the data.
Regarding your question about the upload state property: you can, but depending on the use case it might be unnecessary, especially if the files are small; before you even see the "pending" state it will already have changed to "uploaded". But again, it depends on what you need.
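To illustrate option 2.3 together with the pending-state idea from the question, a hedged sketch; the `files` collection, route names, and field names are assumptions rather than anything from the thread, and it reuses the assumed `s3`, `bucketName`, `app`, and `checkAuthentication` from the earlier snippets:

import crypto from 'node:crypto'
import { createPresignedPost } from '@aws-sdk/s3-presigned-post'

// Assumes an already-connected MongoDB `db` and express.json() for parsing JSON bodies.
const files = db.collection('files')

// Issue the presigned POST and record the file as pending in the same request.
app.post('/presign', checkAuthentication, async (req, res) => {
  const key = crypto.randomUUID()
  const { url, fields } = await createPresignedPost(s3, {
    Bucket: bucketName,
    Key: key,
    Expires: 600,
  })
  await files.insertOne({
    _id: key,
    name: req.body.filename,
    upload_state: 'pending',
    createdAt: new Date(),
  })
  res.json({ url, fields, key })
})

// Called by the client once the direct-to-S3 upload has finished.
app.post('/upload-complete', checkAuthentication, async (req, res) => {
  await files.updateOne({ _id: req.body.key }, { $set: { upload_state: 'uploaded' } })
  res.sendStatus(204)
})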
JulieCezar · 7mo ago
And you are probably getting the error because you forgot to put enctype='multipart/form-data' on your form https://blog.logrocket.com/multer-nodejs-express-upload-file/