I am not able to hit the API on my serverless Ollama server running the llama3.2 model. Here is the screenshot:

2 Replies
yhlong00000
3w ago
Looks like your payload is missing the `input` field.
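For context: RunPod serverless endpoints expect the JSON request body to wrap everything under a top-level `"input"` key, which the worker's handler then reads. A minimal sketch of a valid payload is below; the inner fields (`model`, `prompt`) are assumptions and depend entirely on how the worker image's handler is written, and the endpoint ID and API key are placeholders:

```python
import json

# RunPod serverless requires a top-level "input" object; a body without
# it is rejected before the handler even runs.
payload = {
    "input": {
        "model": "llama3.2",            # assumption: the handler forwards this to Ollama
        "prompt": "Why is the sky blue?"
    }
}

body = json.dumps(payload)
print(body)

# The actual request would then be something like:
#   POST https://api.runpod.ai/v2/<ENDPOINT_ID>/runsync
#   Authorization: Bearer <RUNPOD_API_KEY>
#   Content-Type: application/json
```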
SvenBrnn
3w ago
Which serverless Ollama repository are you using as a base? The one I found on the RunPod blog didn't really work well anymore, so I ended up building my own: see https://discord.com/channels/912829806415085598/1334451089256480768/1334451089256480768. If you still have problems with it, I can try to help.
