how to set a max output token
Created by Heartthrob10 on 8/1/2024 in #⚡|serverless • 12 replies
Heartthrob10: This should do the job; let me try it.
Heartthrob10: No, this is more relevant to the context length, right? I'm talking about output tokens.
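(For reference, since the thread hinges on this distinction: on a RunPod serverless vLLM worker, the context window is a worker-level setting, typically the MAX_MODEL_LEN environment variable, while the number of generated tokens is capped per request via max_tokens in sampling_params. Below is a minimal sketch of such a request; the endpoint ID, API key, and prompt are placeholders, and the payload shape assumes the standard vLLM worker input format.)

```python
import requests

# Placeholders -- substitute your own endpoint ID and API key.
ENDPOINT_ID = "your_endpoint_id"
API_KEY = "your_runpod_api_key"

# "max_tokens" caps the *output* tokens for this request only.
# The context length is configured separately on the worker
# (e.g. the MAX_MODEL_LEN environment variable).
payload = {
    "input": {
        "prompt": "Summarize the plot of Hamlet in two sentences.",
        "sampling_params": {
            "max_tokens": 256,   # cap on generated (output) tokens
            "temperature": 0.7,
        },
    }
}

resp = requests.post(
    f"https://api.runpod.ai/v2/{ENDPOINT_ID}/runsync",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json=payload,
    timeout=120,
)
print(resp.json())
```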
Heartthrob10: I'm not using Llama 3.1; it's the old Llama 3.