Cloudflare Developers

Welcome to the official Cloudflare Developers server. Here you can ask for help and stay updated with the latest news

I have cache headers from the request on

I have cache headers from the request on an mp4 file:
Cache-Control: max-age=2678400
Cf-Cache-Status: HIT
https://assets.flayks.com/4thsex-screencast.mp4...

Hi Team I need some help with building a

Hi Team, I need some help with building a custom HLS streaming service using R2. I'm aware of the streaming service provided by Cloudflare, but I want to build a custom one on R2. I need help implementing streaming of multiple HLS files through R2. Do I need to create a presigned URL for each file, or is there a better way, considering my bucket is not public? Please help!
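One way to serve a private bucket is to presign a GET URL for the playlist and each segment it references. Below is a minimal sketch of SigV4 query-string presigning against R2's S3 endpoint using only the Python standard library; the account ID, bucket, and key are placeholders, and a production setup would more likely use an S3 SDK or a Worker that checks auth and reads from a bucket binding.

```python
import datetime
import hashlib
import hmac
import urllib.parse

def presign_get(account_id, access_key, secret_key, bucket, key, expires=3600):
    """Build a SigV4 query-string-presigned GET URL for an R2 object."""
    host = f"{account_id}.r2.cloudflarestorage.com"
    path = f"/{bucket}/{urllib.parse.quote(key)}"
    now = datetime.datetime.now(datetime.timezone.utc)
    amz_date = now.strftime("%Y%m%dT%H%M%SZ")
    datestamp = now.strftime("%Y%m%d")
    scope = f"{datestamp}/auto/s3/aws4_request"  # R2 uses the "auto" region

    params = {
        "X-Amz-Algorithm": "AWS4-HMAC-SHA256",
        "X-Amz-Credential": f"{access_key}/{scope}",
        "X-Amz-Date": amz_date,
        "X-Amz-Expires": str(expires),
        "X-Amz-SignedHeaders": "host",
    }
    # Canonical query string: keys sorted, every byte percent-encoded
    query = "&".join(
        f"{urllib.parse.quote(k, safe='')}={urllib.parse.quote(v, safe='')}"
        for k, v in sorted(params.items())
    )
    canonical = "\n".join(
        ["GET", path, query, f"host:{host}\n", "host", "UNSIGNED-PAYLOAD"]
    )
    to_sign = "\n".join([
        "AWS4-HMAC-SHA256", amz_date, scope,
        hashlib.sha256(canonical.encode()).hexdigest(),
    ])
    # Derive the signing key via the HMAC chain, then sign
    k = f"AWS4{secret_key}".encode()
    for part in (datestamp, "auto", "s3", "aws4_request"):
        k = hmac.new(k, part.encode(), hashlib.sha256).digest()
    sig = hmac.new(k, to_sign.encode(), hashlib.sha256).hexdigest()
    return f"https://{host}{path}?{query}&X-Amz-Signature={sig}"
```

For HLS specifically, you would presign the `.m3u8` playlist plus every segment it lists, rewriting the playlist so its segment URIs point at the presigned URLs before returning it to the player.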

Dangling domain

Is there someone from CF I can give my account ID and bucket name to, to have a quick look at it? 🙏

I m experiencing a bug when attaching a

I'm experiencing a bug when attaching a domain to an R2 bucket, which may be specifically related to Punycode/IDN (Internationalized Domain Names). The DNS records are successfully added and the TLS cert and HTTPS endpoint are initialized, but the website serves Error 1014. Additionally, an error message is shown when trying to "reconnect" the R2 custom domain using the "..." menu. Lastly, if we delete the R2 custom domain, the DNS entries are not deleted (and the UI provides no option to delete them), as shown by the orphaned DNS entry bin.føø.com (bin.xn--f-5gaa.com)....

CORS

Still no solution for this?

I just created a new R2 bucket within

I just created a new R2 bucket within the EU jurisdiction, but the dashboard is showing internal errors (code 10001) for the policy endpoint. If I create a bucket with an automatic location, this does not happen. Is this a known issue?

You can use something like Rclone to

You can use something like Rclone to sync two R2 buckets (run it on a VM or similar for speed if your bucket is very large)
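A minimal sketch of that approach, assuming two S3-type remotes named `r2src` and `r2dst` in your rclone config; the remote names, bucket names, credentials, and account IDs below are placeholders:

```shell
# ~/.config/rclone/rclone.conf — one remote per account/credential pair
# [r2src]
# type = s3
# provider = Cloudflare
# access_key_id = <source-access-key>
# secret_access_key = <source-secret>
# endpoint = https://<source-account-id>.r2.cloudflarestorage.com
#
# [r2dst]
# type = s3
# provider = Cloudflare
# access_key_id = <dest-access-key>
# secret_access_key = <dest-secret>
# endpoint = https://<dest-account-id>.r2.cloudflarestorage.com

# Then sync; raise --transfers/--checkers for very large buckets
rclone sync r2src:source-bucket r2dst:dest-bucket --transfers 32 --checkers 64
```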

How to connect your Cloudflare R2 Storage to CyberDuck

How to connect your Cloudflare R2 Storage to CyberDuck, for an easier way to play around with your R2 bucket! If you haven't already, download your flavour (version) with the link below ⬇️ https://cyberduck.io/download...

Any of the SDKs compatible with AWS S3

Any of the SDKs compatible with AWS S3 will work by swapping out the endpoint URL
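For example, with the AWS CLI (itself an S3-compatible client) only the endpoint needs to change; the account ID and bucket name below are placeholders:

```shell
# Point any S3-compatible tool at R2's S3 endpoint (R2 uses the "auto" region)
aws s3 ls s3://my-bucket \
  --endpoint-url https://<account-id>.r2.cloudflarestorage.com \
  --region auto
```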

speed deviation with custom domain

Hello, I'm trying to send objects (<= 100 MB) with a custom domain. The download speed deviation is severe on the same device and colo (40 mb/s down to 6 mb/s, HKG). I checked with several users and found that some users were very fast or very slow even though their colo differed (mostly HKG, KIX)...

I rolled out presigned URL support to my

I rolled out presigned URL support to my users (site owners who use my Cloudflare integration system, not end users) a few days ago, and I'm already getting questions from site owners asking why presigned URLs only work over HTTP/1.1. I know the S3 API is limited to HTTP/1.1, but presigned URLs would benefit noticeably from HTTP/2 or HTTP/3, since end users make multiple HTTP requests concurrently. Hopefully it's on the roadmap. It does seem silly when a site runs over HTTP/3 but some attachments delivered via presigned URLs are HTTP/1.1...

Hi I m trying to integrate R2 into my

Hi! I'm trying to integrate R2 into my elixir app. I found this thread https://discord.com/channels/595317990191398933/1114222075268321322/1114246362284970036 but the person there says that it just worked. I keep getting 400 for streaming a file. Is that not supported yet?

also does cloudflare r2 store my data in

Also, does Cloudflare R2 store my data in multiple regions? (I don't mean caching; I mean a backup.)

I already try then cursor freezes and no

I already tried; then the cursor freezes and there's no progress shown. In Cloudflare, the same storage is used.

Hello Unless I m missing something

Hello! Unless I'm missing something obvious, I'm pretty sure I'm running into an R2 bug. I'm using a token with "Object Read & Write" permissions and get AccessDenied when using rclone to write an object to the bucket. I know I have the access id & secret key correct because as soon as I switch the token permissions to "Admin Read & Write" privileges, it starts working and rclone will correctly write new objects to the bucket. When I switch it back, it stops working again. What I'm doing is very simple:
```
sergey@ark ~> echo Hello World > myfile.txt...
```

Concurrency limits

Ya... the setup is 2 buckets. One is for the public stuff (and it is indeed accessed directly on a public domain and optimized appropriately with Cache Rules and Tiered Cache); that's for things intended to be public, like user avatars. The private bucket that is accessed via the API is the one he's having a problem with. The objects there are accessed via the API because the ability to view/download them is based on user permissions: the application checks the appropriate permissions for the logged-in user and then passes the object through as necessary. He isn't doing a crazy amount of traffic or anything... he's at ~19M Class B operations this month across all his buckets. The issue here is with the private bucket... since access is permission-based, the object is passed through via the API. It's nowhere remotely close to 1,000 concurrent read operations even across all objects/users; I'd guess it peaks around 10ish....

Presigned URLs

I need some help -- I'm trying to generate a presigned upload URL with R2 and I keep getting this error: "The request signature we calculated does not match the signature you provided. Check your secret access key and signing method." My Go code is as follows:...