RunPod12mo ago
nerdylive

Hello, I think my template downloaded the Docker template image while running my request.

I've deleted the worker because it was still running, and I cancelled the request, but here's my endpoint ID: 489pa1sglkvuhf. By the time I realized it was downloading the Docker image instead of my model, it was too late.
5 Replies
nerdylive
nerdyliveOP12mo ago
Please help, I've been charged for downloading the Docker image. For now I've created a new worker and it worked. Nvm, it's not working yet, still initializing.
ashleyk
ashleyk12mo ago
You don't get charged for downloading Docker images in serverless. Sounds like there is a problem with your Docker image if it stays in the initializing state; it should go to idle.
nerdylive
nerdyliveOP12mo ago
Really? I'm using RunPod's vLLM worker ( runpod/worker-vllm:dev ). The logs show the same thing as downloading the Docker image. Maybe it loaded the old one from cache, went idle, then started downloading the newer one when running. See:
2023-12-22T11:16:18.777584427Z (RayWorkerVllm pid=2031)
output-00001-of-00003.safetensors: 0%| | 0.00/8.59G [00:00<?, ?B/s]
2023-12-22T11:16:41.937548928Z (RayWorkerVllm pid=2031)
output-00001-of-00003.safetensors: 0%| | 10.5M/8.59G [00:23<5:15:49, 453kB/s]
2023-12-22T11:16:41.937602560Z (pid=2032) /usr/local/lib/python3.11/dist-packages/transformers/utils/hub.py:123: FutureWarning: Using `TRANSFORMERS_CACHE` is deprecated and will be removed in v5 of Transformers. Use `HF_HOME` instead.
2023-12-22T11:16:41.937609294Z (pid=2032) warnings.warn(
2023-12-22T11:16:51.681805420Z (RayWorkerVllm pid=2031)
output-00001-of-00003.safetensors: 0%| | 21.0M/8.59G [00:32<3:28:07, 686kB/s]
My model hasn't even been downloaded; it was downloading the image.
flash-singh
flash-singh12mo ago
Did you push a new change to the same tag? Make sure to use new tags for all changes.
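To illustrate the advice above, here is a minimal sketch of building and pushing with a unique tag per build instead of reusing a mutable tag like `:dev` (the image name `yourname/worker-vllm` is a placeholder, and the `docker` commands are echoed rather than run so the snippet stays self-contained):

```shell
#!/bin/sh
# Assumption: you maintain your own image repo; substitute the real name.
IMAGE="yourname/worker-vllm"

# Generate a unique tag per build, e.g. a timestamp (a git SHA also works).
TAG="$(date +%Y%m%d-%H%M%S)"

# Echoed for illustration; drop the `echo` to actually build and push.
echo "docker build -t ${IMAGE}:${TAG} ."
echo "docker push ${IMAGE}:${TAG}"
```

Because the tag is immutable, workers that cached an old layer can't silently keep serving it; pointing the endpoint at the new tag forces a clean pull.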
nerdylive
nerdyliveOP12mo ago
No no, I'm using the original tag, dev if I'm not wrong, and I didn't push any changes. I'm using RunPod's image from Docker Hub.