Workers AI

So I'm having a problem with the LLaMA AI: for some reason it doesn't listen to the prompts. Here is an example of the output: [{"inputs":{"prompt":"what is python"}},{"inputs":{"response":"Hello! I'm LLaMA, an AI assistant developed by Meta AI that can understand and respond to human input in a conversational manner. I'm here to help you with any questions or topics you'd like to discuss. Is there something specific you'd like to talk about or ask?"}}] and here is the code:
import { Ai } from './vendor/@cloudflare/ai.js';

export default {
  async fetch(request, env) {
    const tasks = [];
    const ai = new Ai(env.AI);

    const url = new URL(request.url);
    const query = url.searchParams.get('q');

    let simple = {
      prompt: query || 'Tell me a joke about Cloudflare'
    };
    let response = await ai.run('@cf/meta/llama-2-7b-chat-int8', simple);
    tasks.push({ inputs: simple });

    let chat = {
      messages: [
        { role: 'system', content: 'you are an chatbot to chat with user' },
      ]
    };
    response = await ai.run('@cf/meta/llama-2-7b-chat-int8', chat);
    tasks.push({ inputs: response });

    return Response.json(tasks);
  }
};
3 Replies
GGLVXD (OP) · 14mo ago
Also, what are the limits of the LLaMA AI?
Tin Cap · 13mo ago
You're querying the AI twice: once with your query, and once with only the system message "you are an chatbot to chat with user" and no user message at all. You're then outputting the result of that second query, which never sees your prompt. Given that, the results you're getting make sense.
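A minimal sketch of the fix: include the query as a `user` message in the chat request, and push the model's responses rather than a request object. The `mockRun` function below is a stand-in for `ai.run('@cf/meta/llama-2-7b-chat-int8', ...)` so the flow can be shown end to end; in a real Worker you'd call the binding directly.

```javascript
// Stand-in for ai.run() — a real Worker would call:
//   await ai.run('@cf/meta/llama-2-7b-chat-int8', input)
async function mockRun(model, input) {
  return { response: `echo: ${JSON.stringify(input)}` };
}

async function handle(query) {
  const tasks = [];

  // First request: plain prompt form, unchanged from the original.
  const simple = { prompt: query || 'Tell me a joke about Cloudflare' };
  const simpleResponse = await mockRun('@cf/meta/llama-2-7b-chat-int8', simple);
  tasks.push({ inputs: simple, response: simpleResponse });

  // Second request: chat form. The original code sent only a system
  // message, so the model never saw the query — add it as a user message.
  const chat = {
    messages: [
      { role: 'system', content: 'You are a helpful chatbot.' },
      { role: 'user', content: query || 'Tell me a joke about Cloudflare' },
    ],
  };
  const chatResponse = await mockRun('@cf/meta/llama-2-7b-chat-int8', chat);
  tasks.push({ inputs: chat, response: chatResponse });

  return tasks;
}
```

With this shape, both entries in the JSON output pair the request with the model's reply, instead of the second entry replacing its `inputs` with the first response.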