RunPod
•Created by lucasavila00 on 1/7/2024 in #⚡|serverless
Restarting without error message
It is very weird because logging works: I can print "about to call the LlamaCpp constructor" and that message shows up in the logs in the UI. But it doesn't show the error; it just shows the
NVIDIA
CUDA Version 11.8.0
etc., which appears when the docker image starts.
9 replies
To me it feels like a bug in the serverless UI: it seems it can't report logs if the Python process crashes.
I did not try to reproduce this with another process; the docker command is
CMD python3.11 -u /handler.py
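With an entrypoint like that, one way to make sure a Python-level failure at least reaches the logs is to wrap the handler so the traceback is printed and flushed before the process dies. A minimal sketch (the `handler` function below is a hypothetical stand-in, not the actual code from this thread):

```python
import sys
import traceback

def handler(job):
    # Hypothetical stand-in for the real /handler.py job function;
    # the real one constructs a llama-cpp-python model here.
    raise RuntimeError("simulated crash")

def main():
    try:
        handler({"input": {}})
    except BaseException:
        # Write the full traceback to stdout and flush it, so the
        # serverless log viewer captures it before the process exits.
        traceback.print_exc(file=sys.stdout)
        sys.stdout.flush()
        raise
```

Note the caveat: this only catches Python-level exceptions. A hard crash inside native code (e.g. a segfault in llama.cpp) bypasses `try/except` entirely, which would match the silent-restart symptom described here.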
I can fix it by downgrading https://github.com/abetlen/llama-cpp-python/releases to v0.2.23.
The path etc. works correctly; I'm testing it locally with nvidia-docker too.
I have error logging, but it shows nothing.
It prints the model path, then restarts.
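A print followed by a silent restart with no traceback suggests the crash happens in native code, which Python error logging can't see. A sketch of one standard-library workaround, assuming it is enabled before `llama_cpp` is imported: `faulthandler` makes the interpreter dump each thread's Python stack to stderr even on a segfault.

```python
import faulthandler
import tempfile

# Enable as early as possible (before importing the native extension)
# so a hard crash (SIGSEGV/SIGABRT) still dumps a Python-level stack
# trace to stderr instead of the process dying silently.
faulthandler.enable(all_threads=True)

def dump_stack_snapshot() -> str:
    # Illustration only: dump_traceback() uses the same machinery the
    # crash handler does. faulthandler writes straight to a file
    # descriptor, so use a real temporary file, not an in-memory buffer.
    with tempfile.TemporaryFile(mode="w+") as f:
        faulthandler.dump_traceback(file=f)
        f.seek(0)
        return f.read()
```

This doesn't fix the underlying crash, but it can turn a silent restart into a traceback that points at the failing native call.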
I have a custom template that can reproduce the issue. I deleted the broken workers and logs.