Hi, I have a problem with Cloudflare

Hi, I have a problem with Cloudflare Workers + Hono + Cloudflare AI. I have a route defined (i.e. /v1/chat/completions), and when the user triggers that route I try to run env.AI.run([model], [etc]), but I get this error: TypeError: Cannot read properties of undefined (reading 'AI'). This is my Env interface:
interface Env {
  AI: any;
}

const env = (globalThis as any).env as Env
I also tried removing the const env = [...] line and instead running AI inside a function declared as function exampleFunction(env: Env), but that didn't work either.
18 Replies
James
James7mo ago
env isn't available on globalThis. It's in your fetch handler, or in the case of Hono, available on context.env.
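For reference, a minimal sketch of what that looks like in a plain Worker (a stub AI binding and a placeholder model name stand in for the real ones here):

```typescript
// Sketch only: bindings are handed to the fetch handler per request,
// never attached to globalThis.
interface Env {
  AI: { run: (model: string, inputs: unknown) => Promise<unknown> };
}

const worker = {
  async fetch(_request: Request, env: Env): Promise<Response> {
    // env.AI exists here, scoped to this request
    const answer = await env.AI.run("some-model", { prompt: "hi" });
    return Response.json({ response: answer });
  },
};

export default worker;
```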
kyv0
kyv0OP7mo ago
So I could call context.env.AI?
James
James7mo ago
Yep. Something like this (pseudo):
type Bindings = {
  AI: Ai;
}

const app = new Hono<{ Bindings: Bindings }>()

app.get('/foo', async (context) => {
  const data = await context.env.AI.run(...);

  // do something
})

export default app
kyv0
kyv0OP7mo ago
Thank you very much gonna try rn
James
James7mo ago
https://hono.dev/getting-started/cloudflare-workers is a great docs reference if you hit any issues
kyv0
kyv0OP7mo ago
okay, now the function is working, but TypeScript says that '@cf/tinyllama/tinyllama-1.1b-chat-v1.0' (example model) isn't any of the possible models ("@cf/unum/uform-gen2-qwen-500m" | "@cf/llava-hf/llava-1.5-7b-hf")
Isaac McFadyen
Isaac McFadyen7mo ago
Make sure your version of @cloudflare/workers-types is up to date in your package.json
kyv0
kyv0OP7mo ago
just updated, yeah, I'm on the latest version. Okay, this is weird: if I deploy the code it just works???? But TypeScript says it shouldn't
James
James7mo ago
the types for Workers AI can be a little weird. Can you share what you're passing to AI.run()?
kyv0
kyv0OP7mo ago
sure, one moment
const answer = await c.env.AI.run('@cf/tinyllama/tinyllama-1.1b-chat-v1.0', {
  messages: [
    {
      role: "system",
      content: `[very large system prompt]`
    },
    {
      role: "user",
      content: `How many pounds of food does a person eat in a year?`
    }
  ],
  stream: false
})
this is literally everything, and after that I have a return c.json({ response: answer }). I don't even look for messages from the user because I need it to work first lol. edit: stream is a boolean constant that I set to false; I edited it here for clarity
James
James7mo ago
something isn't matching up to hit this function overload: https://github.com/cloudflare/workerd/blob/b3c613b9a5e3265a7e468e7c62d1a14df1e057df/types/defines/ai.d.ts#L197 I can't see exactly what in your snippet, but something isn't quite accurate. At runtime it's likely just being ignored, but the types get more angry
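To illustrate why overload mismatches produce unhelpful errors (hypothetical overloads below, not the real Workers AI signatures): when a call matches no overload exactly, TypeScript reports the failure against whichever overload it last tried, so the error rarely points at the actual mismatched field.

```typescript
// Hypothetical overloads mimicking the AI.run() shape. If a call site
// passes, say, { messages: [...], stream: "no" }, it matches neither
// overload, and the compiler error cites an arbitrary one of them.
function run(model: "chat-model", input: { messages: unknown[]; stream?: boolean }): Promise<string>;
function run(model: "image-model", input: { image: number[] }): Promise<Uint8Array>;
async function run(model: string, input: Record<string, unknown>): Promise<string | Uint8Array> {
  // Runtime just dispatches on the payload shape; extra or slightly-wrong
  // fields are ignored, matching the "works when deployed" behaviour above.
  return "messages" in input ? "chat-response" : new Uint8Array();
}
```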
kyv0
kyv0OP7mo ago
I got into a wrangler tail console, and this is what I got from the worker:
@ 28/5/2024, 21:48:32
(log) Model: @cf/tinyllama/tinyllama-1.1b-chat-v1.0 ---> debug console.log
(error) AiError: 5006: must have required property 'prompt', must be boolean, must be array, must be boolean, must match exactly one schema in oneOf
maybe messages isn't supported anymore? and now I have to use prompt lol
kyv0
kyv0OP7mo ago
I don't think so
James
James7mo ago
I'd recommend running with wrangler dev locally for testing - iterating that way should be much quicker
Isaac McFadyen
Isaac McFadyen7mo ago
So prompt is for when you're doing raw prompt input yourself - for example, if you're manually formatting ChatML. You should be able to use messages for that model though, odd that the types aren't allowing you to.
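Roughly, the two mutually exclusive input shapes look like this (a sketch inferred from the "oneOf" wording of the AiError above, not the exact generated types):

```typescript
// Sketch of the two text-generation input shapes; the schema accepts
// exactly one of them (the "oneOf" in the AiError).
type PromptInput = {
  prompt: string; // raw prompt input, e.g. hand-formatted ChatML
  stream?: boolean;
};

type MessagesInput = {
  messages: { role: "system" | "user" | "assistant"; content: string }[];
  stream?: boolean;
};

type TextGenInput = PromptInput | MessagesInput;

// Tiny helper showing which branch a given input would satisfy
function inputKind(input: TextGenInput): "prompt" | "messages" {
  return "prompt" in input ? "prompt" : "messages";
}
```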
kyv0
kyv0OP7mo ago
I tried to use it, but then I remembered that I'm using Postman for testing, so I couldn't use 127.0.0.1 (the web version of Postman, to clarify). Time to move to JavaScript lol
James
James7mo ago
cc @michelle @Logan Grasby some good feedback in here about the types being really difficult to use/debug with them relying on function overloads
rayberra
rayberra7mo ago
Those errors can become really weird. I got hit with AiError: 5006: must have required property 'prompt', must be >= 1, must be >= 1, must match exactly one schema in oneOf when adding "seed: 0". (Trying to tell me that seed must be >= 1.)