avif · 2y ago

Having problems working with `Llama-2-7b-chat-hf`

I have the following request going to the `runsync` endpoint:
{
    "input": {
        "prompt": "the context. Give me all the places and year numbers listed in the text above"
    }
}

(Full request here: https://pastebin.com/FLqjRzRG)
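(For completeness, this is roughly how I'm calling it. The endpoint ID and API key below are placeholders, and I'm assuming the usual `https://api.runpod.ai/v2/<endpoint_id>/runsync` URL pattern.)

```python
import requests

# Placeholders, not my real values.
ENDPOINT_ID = "your-endpoint-id"
API_KEY = "your-runpod-api-key"

# Same shape as the payload above (the full prompt is in the pastebin).
payload = {
    "input": {
        "prompt": "the context. Give me all the places and year numbers listed in the text above"
    }
}

resp = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=120,
)
resp.raise_for_status()
print(resp.json())
```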

This is the result:
{
    "delayTime": 915,
    "output": {
        "input_tokens": 794,
        "output_tokens": 16,
        "text": [
            " Sure! Here are all the places and year numbers listed in the text:\n"
        ]
    },
    "status": "COMPLETED"
}
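(On my side I just index into that JSON to get the text, nothing fancy:)

```python
# Reading the fields out of the response JSON shown above.
result = {
    "delayTime": 915,
    "output": {
        "input_tokens": 794,
        "output_tokens": 16,
        "text": [" Sure! Here are all the places and year numbers listed in the text:\n"],
    },
    "status": "COMPLETED",
}

output = result["output"]
print(result["status"])         # COMPLETED
print(output["output_tokens"])  # only 16 tokens generated
print(output["text"][0])        # just the lead-in sentence, nothing after it
```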

This is a very poor answer: the response cuts off right after `" Sure! Here are all the places and year numbers listed in the text:\n"` (only 16 output tokens), without listing any of the actual places or years.
What am I missing?
Thanks