Cloudflare Developers

Welcome to the official Cloudflare Developers server. Here you can ask for help and stay updated with the latest news

Express-ish, Next.js API-routes

Hello, I'm trying to upload a file with a pre-signed URL to R2 using the browser and I'm getting a 403 CORS error. Does anyone know how to solve this problem? My CORS policy: [ { "AllowedOrigins": [...
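
A 403 on a pre-signed PUT from the browser usually means the bucket's CORS policy doesn't allow the request's origin, method, or headers. A minimal policy sketch — the origin and header values here are placeholders, adjust them to your app:

```json
[
  {
    "AllowedOrigins": ["https://example.com"],
    "AllowedMethods": ["GET", "PUT"],
    "AllowedHeaders": ["Content-Type"],
    "MaxAgeSeconds": 3600
  }
]
```

Note that any header the upload actually sends (e.g. `Content-Type`) must appear in `AllowedHeaders`, or the preflight itself fails with a 403.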

Hi, I'm trying to create a bucket through the command line: wrangler r2 create bucket NAME and I'm getting a ``` ⛅️ wrangler 3.1.1 ------------------ Creating bucket NAME. ...

I'm getting really slow download speeds from R2 directly from the S3 endpoint. It's taking 2000 ms to pull two 1 kB parquet files, which should be more like 50 ms. Is there anything I can check? For comparison, testing with MinIO I get them in 50 ms.

I'm hoping someone who works for Cloudflare will see this: I'm seeing consistent 500 Server errors when trying to use the R2 S3 API. This started at the beginning of the month and hasn't changed. It was working for me before with no issues, and I didn't change anything; it just started giving me 500 errors. I can see the API hits in the Metrics tab of the web interface but nothing gets stored or retrieved....

With ?response-content-encoding= ```sh ➜ ~ file test.gz test.gz: gzip compressed data, from Unix, original size modulo 2^32 1684...

The docs page you linked has an example of that

@sdnts no, I'm using the S3 module helper in Elixir ```config = %{ region: @s3_region, access_key_id: System.fetch_env!("AWS_ACCESS_KEY_ID"), secret_access_key: System.fetch_env!("AWS_SECRET_ACCESS_KEY") }...

Is it better to create a “general help”

S3 upload speeds (slow)

I've also been waiting for 2 weeks on my Cloudflare tickets. Wait times are insane, even when we're paying over $100 per month (and increasing every month) for Cloudflare products...

this is the part you use dev tools and steal from the dashboard for

but it's just the files that are actually in the bucket. In my local wrangler setup, the files will never be uploaded there because I have them locally in .wrangler/state/r2 then. But someone in the #wrangler channel suggested just serving those files with any file server, which I'll try now

CORS from Webworker?

Actually it works using the direct URL but here I'm preloading a bunch of videos in a WebWorker: ```js const preloadVideo = async (url: string) => { const res = await fetch(url) const blob = await res.blob()...

FileZilla

we did try this (with FileZilla Pro) but it is not working, hence wanted to see if any specific documentation is present. I will review our settings again

minio .net

personally I've used MinIO for .NET without any issue; potentially another option: https://min.io/docs/minio/linux/developers/dotnet/minio-dotnet.html

I mean, the code "is working": it returns a signed URL which seems to be invalid

Please help out here. I have moved a lot of our data from S3 to R2. Everything is working: in Django I have made a custom storage class, and some models are using it. But there is one small issue. When I try to get the file size I get a 400 HeadObject error, and this happens on the endpoint URL and not on the custom domain. How do I force botocore to use the custom domain? Everywhere else it uses the custom domain as-is... so why not here....

You can import render as a library, do your own auth and then pass it off to R2 as render.fetch(req, env, ctx);
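
A rough sketch of that pattern — `render` here stands in for whatever library export you're wrapping (its import path is up to you), and `env.AUTH_TOKEN` is a Worker secret you'd configure yourself; both are assumptions, not a confirmed API:

```javascript
// Gate requests with your own auth, then delegate to the library's fetch
// handler. `withAuth` wraps any object exposing fetch(request, env, ctx).
const withAuth = (render) => ({
  async fetch(request, env, ctx) {
    const auth = request.headers.get("Authorization");
    if (auth !== `Bearer ${env.AUTH_TOKEN}`) {
      return new Response("Unauthorized", { status: 401 });
    }
    // Auth passed: hand the request off to the library unchanged.
    return render.fetch(request, env, ctx);
  },
});
```

In the Worker you'd then do something like `export default withAuth(render);` so every request is checked before the library sees it.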

Is there any ETA for per-bucket tokens? We want our data suppliers to upload directly to R2 (can't use browsers because it's many TB per dataset), but for that to work we need tokens that only have write access to a specific bucket
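
Until scoped tokens exist, one workaround is to front the bucket with a Worker: the Worker is bound to exactly one bucket, so the shared key you hand a supplier can only ever write there. A sketch under that assumption — `DATA_BUCKET`, `UPLOAD_KEY`, and the `X-Upload-Key` header name are all placeholders:

```javascript
// Worker bound to a single R2 bucket; suppliers authenticate with a shared
// key and can only PUT objects into this one bucket.
const uploader = {
  async fetch(request, env) {
    if (request.method !== "PUT") {
      return new Response("Method Not Allowed", { status: 405 });
    }
    if (request.headers.get("X-Upload-Key") !== env.UPLOAD_KEY) {
      return new Response("Forbidden", { status: 403 });
    }
    // Use the URL path as the object key.
    const key = new URL(request.url).pathname.slice(1);
    // env.DATA_BUCKET is the R2 bucket binding from wrangler.toml (assumption).
    await env.DATA_BUCKET.put(key, request.body);
    return new Response(`Stored ${key}`, { status: 201 });
  },
};
export default uploader;
```

For multi-TB datasets you'd want to extend this with multipart uploads rather than a single PUT, but the access-scoping idea is the same.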

If it returns, it deleted; if it throws an error, it didn't
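
Wrapped as a helper, assuming a client whose delete call rejects on failure (the `client.delete` shape is an assumption about whatever SDK is in use):

```javascript
// Returns true when the delete call resolves (object deleted) and false
// when it throws (delete failed), per the semantics described above.
async function deletedOk(client, key) {
  try {
    await client.delete(key);
    return true;
  } catch {
    return false;
  }
}
```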

Hi all, I'm currently trying to perform a multi-file upload using Cloudflare Workers and R2 but I keep getting this error
EntityTooSmall: Your proposed upload is smaller than the minimum allowed object size.
...
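
EntityTooSmall generally means some non-final part of a multipart upload was below the 5 MiB minimum that S3-compatible APIs enforce (only the last part may be smaller). One fix is to coalesce incoming chunks into parts of at least 5 MiB before uploading each part; a sketch of just the coalescing logic:

```javascript
// Coalesce arbitrarily sized input chunks into multipart part sizes that
// respect the 5 MiB minimum; only the final part may be smaller.
const MIN_PART = 5 * 1024 * 1024;

function planParts(chunkSizes) {
  const parts = [];
  let current = 0;
  for (const size of chunkSizes) {
    current += size;
    if (current >= MIN_PART) {
      parts.push(current); // flush once we have at least one full part
      current = 0;
    }
  }
  if (current > 0) parts.push(current); // final part may be under 5 MiB
  return parts;
}
```

With this plan, each buffered part is safe to pass to the upload-part call of whichever client you're using, since every part except the last meets the minimum.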