I am not able to hit the API on the serverless Ollama server with the llama3.2 model. Here is the screenshot:

Looks like your payload is missing the `input` field.
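For reference, RunPod serverless endpoints expect the request body to be wrapped in a top-level `input` object; the worker's handler receives that object as the job payload. A minimal sketch of a valid body, where the inner fields (`model`, `prompt`) are assumptions and depend on the specific worker's handler:

```python
import json

# RunPod serverless workers receive the JSON body's "input" object as the
# job payload; a request without a top-level "input" key will fail.
# The fields inside "input" are assumptions -- adjust them to whatever
# your worker's handler actually reads.
payload = {
    "input": {
        "model": "llama3.2",
        "prompt": "Why is the sky blue?",
    }
}

# This is the JSON string you would POST to the endpoint's /runsync or /run URL.
body = json.dumps(payload)
print(body)
```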
Which serverless Ollama repository are you using as a base? The one I found in the RunPod blog did not really work well anymore, so I ended up building my own: see https://discord.com/channels/912829806415085598/1334451089256480768/1334451089256480768
If you still have problems with this, I can try to help.