aurelium
RunPod
Created by NERDDISCO on 7/25/2024 in #⚡|serverless
Llama 3.1 via Ollama
That works, thanks!
19 replies
request:
{
  "input": {
    "method_name": "generate",
    "input": {
      "prompt": "why the sky is blue?"
    }
  }
}
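A hedged variant of the request above: Ollama's /api/generate endpoint streams its response as newline-delimited JSON unless told otherwise, and it accepts a "stream" flag. Assuming the wrapper forwards extra input fields through to Ollama (not confirmed in this thread), asking for a single non-streamed object would look like:

```json
{
  "input": {
    "method_name": "generate",
    "input": {
      "prompt": "why the sky is blue?",
      "stream": false
    }
  }
}
```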
Yeah:
{
  "delayTime": 117699,
  "error": "{\"error_type\": \"<class 'requests.exceptions.JSONDecodeError'>\", \"error_message\": \"Extra data: line 1 column 5 (char 4)\", \"error_traceback\": \"Traceback (most recent call last):\\n File \\\"/usr/local/lib/python3.10/dist-packages/requests/models.py\\\", line 974, in json\\n return complexjson.loads(self.text, **kwargs)\\n File \\\"/usr/lib/python3.10/json/__init__.py\\\", line 346, in loads\\n return _default_decoder.decode(s)\\n File \\\"/usr/lib/python3.10/json/decoder.py\\\", line 340, in decode\\n raise JSONDecodeError(\\\"Extra data\\\", s, end)\\njson.decoder.JSONDecodeError: Extra data: line 1 column 5 (char 4)\\n\\nDuring handling of the above exception, another exception occurred:\\n\\nTraceback (most recent call last):\\n File \\\"/usr/local/lib/python3.10/dist-packages/runpod/serverless/modules/rp_job.py\\\", line 134, in run_job\\n handler_return = handler(job)\\n File \\\"//runpod_wrapper.py\\\", line 39, in handler\\n return response.json()\\n File \\\"/usr/local/lib/python3.10/dist-packages/requests/models.py\\\", line 978, in json\\n raise RequestsJSONDecodeError(e.msg, e.doc, e.pos)\\nrequests.exceptions.JSONDecodeError: Extra data: line 1 column 5 (char 4)\\n\", \"hostname\": \"ogp9bh9fndvgck-64411159\", \"worker_id\": \"ogp9bh9fndvgck\", \"runpod_version\": \"1.6.2\"}",
  "executionTime": 61,
  "id": "c4794910-58f5-4179-98a9-0b0779ba0749-u1",
  "status": "FAILED"
}
I keep getting JSON decoding errors trying to run queries on it...
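The "Extra data: line 1 column 5 (char 4)" in the traceback is the classic symptom of calling `response.json()` on a body that holds several JSON objects: Ollama's /api/generate endpoint streams newline-delimited JSON by default. A minimal sketch of decoding such a body (`parse_ndjson` is a hypothetical helper, not the wrapper's actual code in runpod_wrapper.py):

```python
import json

def parse_ndjson(body: str) -> list:
    """Split a newline-delimited JSON body into a list of objects."""
    return [json.loads(line) for line in body.splitlines() if line.strip()]

# A streamed body with two chunks, shaped like Ollama generate output.
body = '{"response": "The"}\n{"response": " sky", "done": true}\n'
chunks = parse_ndjson(body)
full_text = "".join(c.get("response", "") for c in chunks)
# full_text == "The sky"
```

The alternative is to avoid streaming entirely so a single `response.json()` call works, provided the wrapper lets you set Ollama's "stream" flag to false.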
When you say "In the Container Start Command field, specify the Ollama supported model", do you mean literally just pasting the ollama model ID into that field?