Hey, trying to understand
- you're using multiple azure openai models, so that's likely setting a different cache key each time
- you'd like to get the same response each time though, because it's the same prompt, even though it's using different models
is that correct?
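One way to picture why each model gets its own cache entry: if the cache key is derived from the full request (endpoint plus body), then changing the model name changes the key even when the prompt is identical. This is just a conceptual sketch, not AI Gateway's actual key algorithm:

```python
import hashlib
import json

def cache_key(url: str, body: dict) -> str:
    # Hypothetical: hash the endpoint URL plus the canonicalized request body.
    payload = url + json.dumps(body, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

prompt = [{"role": "user", "content": "Hello"}]
key_a = cache_key("https://gw.example/azure-openai", {"model": "gpt-4o", "messages": prompt})
key_b = cache_key("https://gw.example/azure-openai", {"model": "gpt-4o-mini", "messages": prompt})
print(key_a == key_b)  # False: same prompt, different model -> different cache keys
```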
then for your last bit: using cf-skip-cache: true means you'll bypass the cached version and fetch directly from the provider. are you saying you got the same response each time even though you were using cf-skip-cache: true?
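For reference, here's a minimal sketch of what sending that header looks like. The gateway URL shape and placeholder credentials are assumptions; swap in your own account, gateway, resource, and deployment names:

```python
import json
from urllib.request import Request

# Hypothetical AI Gateway URL for an Azure OpenAI deployment; the
# ACCOUNT_ID / GATEWAY_ID / RESOURCE / DEPLOYMENT segments are placeholders.
url = ("https://gateway.ai.cloudflare.com/v1/ACCOUNT_ID/GATEWAY_ID/"
       "azure-openai/RESOURCE/DEPLOYMENT/chat/completions")

req = Request(
    url,
    data=json.dumps({"messages": [{"role": "user", "content": "Hello"}]}).encode(),
    headers={
        "Content-Type": "application/json",
        "api-key": "AZURE_API_KEY_HERE",  # placeholder credential
        # Ask AI Gateway to bypass its cache and hit the provider directly.
        "cf-skip-cache": "true",
    },
    method="POST",
)
print(req.get_header("Cf-skip-cache"))  # "true" (urllib capitalizes header names)
```

The request object above is only constructed, not sent; pass it to `urllib.request.urlopen` (or use any HTTP client) to actually call the gateway.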
oh thanks
You mean Google AI Studio?
Yep
Hi. noob question... does AI gateway forward client ip address to OpenAI (or other AI API providers)?
hey! i assume this is in reference to the openai email that was sent. we are in contact with openai to make sure ai gateway customers are properly supported
Thanks for the info:)
When using AI Gateway with Vertex AI (Gemini 1.5 Pro), it seems like streaming is broken. It buffers everything.
@Kathy | AI Gateway PM AI Gateway is forcing v1/ on Vertex AI, even if I pass in v1beta1/
how to use workers-ai
Are you talking about how to use workers ai with ai gateway? Or how to use workers ai in the first place?
idk what that is, so i'm thinking it's some kind of AI service endpoint or smth that we can use 😅
Yes, Workers AI is Cloudflare's AI service; it has its own endpoint and a (recently added) OpenAI-compatible one. However, I'd suggest moving further conversation about it to #workers-ai. Also, in that channel, if you click the channel description at the top, there's a link that explains more about what it is and how it works
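To illustrate what "OpenAI-compatible endpoint" means in practice: you point an OpenAI-style client at a Cloudflare base URL instead of api.openai.com. The exact base-URL path below is an assumption; check the Workers AI docs for the current one:

```python
# Sketch: building the base URL for Workers AI's OpenAI-compatible endpoint.
# ACCOUNT_ID is a placeholder, and the /ai/v1 path is an assumption here.
ACCOUNT_ID = "YOUR_ACCOUNT_ID"
base_url = f"https://api.cloudflare.com/client/v4/accounts/{ACCOUNT_ID}/ai/v1"

# With the openai Python package installed, usage would then be roughly:
#   from openai import OpenAI
#   client = OpenAI(api_key="YOUR_CF_API_TOKEN", base_url=base_url)
#   resp = client.chat.completions.create(
#       model="@cf/meta/llama-3.1-8b-instruct",  # hypothetical model slug
#       messages=[{"role": "user", "content": "Hello"}],
#   )
print(base_url)
```

The point is that existing OpenAI SDK code keeps working; only the base URL, token, and model name change.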
what's your pfp 💀
yes hopefully soon
am I doing something wrong, or did Azure OpenAI embeddings through AI Gateway break?
oh wait I am doing something wrong I think
looks like LiteLLM
ya, LiteLLM bug. Sorry! https://github.com/BerriAI/litellm/pull/4629