Created by lucasavila00 on 1/7/2024 in #⚡|serverless
Restarting without error message
It is very weird because logging works: I can print "about to call the LlamaCpp constructor" and that message shows up in the logs in the UI. But it doesn't show the error; it just shows the NVIDIA CUDA Version 11.8.0 banner, etc., which it shows when the Docker image starts.
To me it feels like a bug in the serverless UI: it seems it can't report logs if the Python process crashes. I did not try to reproduce this with another process; the Docker command is CMD python3.11 -u /handler.py
I can fix it by downgrading https://github.com/abetlen/llama-cpp-python/releases to v0.2.23. The path etc. work correctly; I'm testing it locally with NVIDIA Docker too.
I have error logging, but it shows nothing. It prints the model path, and then the worker restarts.
llama2 = None
try:
    if not IS_STUB:
        with open("path.txt", "r") as f:
            model_path = f.read()
        print(model_path)  # prints up to here
        llama2 = models.LlamaCpp(
            model_path, n_gpu_layers=-1, n_ctx=8192, echo=False
        )
except Exception as e:
    print(e)
    print("failed to load model")
    # sleep for 5s
    time.sleep(5)

    raise e
print("loaded model")
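A try/except like the one above cannot catch a crash inside native code, since the process dies before the except clause runs. One thing worth trying (an assumption on my part, not something confirmed in this thread) is the standard-library faulthandler module, enabled at the very top of handler.py, which dumps a Python traceback to stderr when the process receives a fatal signal such as SIGSEGV:

```python
import faulthandler
import sys

# Enable the standard-library fault handler as early as possible.
# If the process dies in native code (e.g. inside llama.cpp), Python's
# try/except never runs, but faulthandler writes a traceback to stderr
# before the process exits, so the crash point is visible in the logs.
faulthandler.enable(file=sys.stderr, all_threads=True)
```

Combined with the -u flag already in the Docker CMD (unbuffered output), this gives the crash a chance to appear in the UI logs before the worker restarts.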
I have a custom template that can reproduce the issue. I deleted the broken workers and logs.