Having problems working with the `Llama-2-7b-chat-hf` model
I have the following request going to the `runsync` endpoint.
(Full request here: https://pastebin.com/FLqjRzRG)
This is the result, which is a very bad answer: "Sure! Here are all the places and year numbers listed in the text:\n"
What am I missing?
Thanks
It’s because your max output tokens is set to 16, so the answer gets cut off. You should send generation parameters alongside the prompt, not just the prompt on its own.
This is all the stuff I send alongside my prompt.
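For reference, a minimal sketch of what such a payload might look like. The exact field names (`sampling_params`, `max_new_tokens`, and so on) depend on the serverless worker's handler, so treat them as assumptions rather than the exact schema:

```python
import json

# Hypothetical /runsync request body; the keys under "input" depend on
# your worker's handler, so these names are placeholders, not a spec.
payload = {
    "input": {
        "prompt": "List all the places and year numbers in the text: ...",
        "sampling_params": {
            "max_new_tokens": 512,  # raise this so output isn't truncated at 16 tokens
            "temperature": 0.7,
            "top_p": 0.9,
        },
    }
}

print(json.dumps(payload, indent=2))
```

The key point is the token limit: whatever your handler calls it, the default of 16 is what truncates the reply mid-sentence.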
Thank you. Can I see your complete JSON?
Thanks again
Here, you should be able to figure it out from this.
Appreciated