interesting_friend_5
RRunPod
Created by Builderman on 2/19/2024 in #⚡|serverless
Mixtral Possible?
prompt = "Tell me about AI"

# Plain template string; {prompt} is filled in by .format() below.
# (An f-string here would substitute {prompt} immediately and make
# the .format() call a no-op.)
prompt_template = '''[INST] {prompt} [/INST] '''
prompt = prompt_template.format(prompt=prompt)

payload = {
    "input": {
        "prompt": prompt,
        "sampling_params": {
            "max_tokens": 1000,
            "n": 1,
            "presence_penalty": 0.2,
            "frequency_penalty": 0.7,
            "temperature": 1.0,
        }
    }
}
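A minimal sketch of how the payload above could be sent to a RunPod serverless endpoint via the documented `/runsync` route, using only the standard library. The endpoint ID and API key are placeholders, not values from this thread, and the helper names are hypothetical.

```python
import json
import urllib.request

def build_payload(user_message: str) -> dict:
    # Wrap the user turn in Mixtral-Instruct's [INST] ... [/INST] tags,
    # matching the sampling parameters from the post above.
    prompt = f"[INST] {user_message} [/INST] "
    return {
        "input": {
            "prompt": prompt,
            "sampling_params": {
                "max_tokens": 1000,
                "n": 1,
                "presence_penalty": 0.2,
                "frequency_penalty": 0.7,
                "temperature": 1.0,
            },
        }
    }

def run_sync(endpoint_id: str, api_key: str, user_message: str) -> dict:
    # POST to the serverless endpoint and block until the job completes.
    req = urllib.request.Request(
        f"https://api.runpod.ai/v2/{endpoint_id}/runsync",
        data=json.dumps(build_payload(user_message)).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req, timeout=300) as resp:
        return json.load(resp)
```

`/runsync` waits for the result in a single request; for long generations the async `/run` + `/status/{id}` pair avoids request timeouts.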
21 replies
I've set the environment variables MODEL_NAME=TheBloke/Mixtral-8x7B-Instruct-v0.1-AWQ and QUANTIZATION=awq. I've got no other custom commands.
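For reference, the same two environment variables as one might set them when launching the vLLM worker locally with Docker; the `runpod/worker-vllm` image name and GPU flags are assumptions, not confirmed in this thread.

```shell
# Sketch: run the vLLM serverless worker with the env vars described above.
docker run --gpus all \
  -e MODEL_NAME="TheBloke/Mixtral-8x7B-Instruct-v0.1-AWQ" \
  -e QUANTIZATION="awq" \
  runpod/worker-vllm
```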
Awesome! Thank you!
I have been trying to run Mixtral AWQ, but I'm not getting any results back in the completed message. I had no trouble with Llama 2, but am struggling to get Mixtral working. Anyone else have this issue?