f4242
RunPod
Created by f4242 on 3/14/2024 in #⛅|pods
Create new pod with runpodctl
I'm trying to create a pod with runpodctl. Reading the --help output, it appears I can't create a pod that uses network storage for /workspace? I didn't find the right option to pass. Maybe with --args? Bonus points: how can I create a pod with specific requirements? E.g., start a pod with 48 GB of VRAM costing less than $1/hr; it could start a pod with 2x A5000 or 1x A6000 depending on available resources.
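For context, here is roughly the workaround I'm considering while the CLI flag seems to be missing: going through the GraphQL API instead of runpodctl, and trying GPU configurations in order of preference. It's only a sketch with placeholder ids; networkVolumeId, the image name, and the exact gpuTypeId strings are assumptions I haven't verified against the current schema.
```python
# Sketch of a workaround via the GraphQL API rather than runpodctl, so that a
# network volume can be mounted at /workspace. Field names follow the
# podFindAndDeployOnDemand example in the API docs; networkVolumeId, the image
# name, and the exact gpuTypeId strings are assumptions to verify against the
# current schema. The <$1/hr preference is expressed only by the order in
# which GPU configurations are tried.
import requests

API_KEY = "YOUR_API_KEY"              # placeholder
NETWORK_VOLUME_ID = "YOUR_VOLUME_ID"  # placeholder: the network volume to mount
URL = f"https://api.runpod.io/graphql?api_key={API_KEY}"

# Preference order: 1x A6000 (48 GB) first, then 2x A5000 (2x 24 GB).
CANDIDATES = [("NVIDIA RTX A6000", 1), ("NVIDIA RTX A5000", 2)]

def try_create(gpu_type_id: str, gpu_count: int) -> dict | None:
    query = f"""
    mutation {{
      podFindAndDeployOnDemand(input: {{
        cloudType: SECURE
        gpuTypeId: "{gpu_type_id}"
        gpuCount: {gpu_count}
        name: "workspace-pod"
        imageName: "runpod/pytorch"
        containerDiskInGb: 20
        volumeMountPath: "/workspace"
        networkVolumeId: "{NETWORK_VOLUME_ID}"
      }}) {{ id machineId }}
    }}"""
    resp = requests.post(URL, json={"query": query}).json()
    return (resp.get("data") or {}).get("podFindAndDeployOnDemand")

for gpu_type_id, gpu_count in CANDIDATES:
    pod = try_create(gpu_type_id, gpu_count)
    if pod:
        print(f"created pod {pod['id']} with {gpu_count}x {gpu_type_id}")
        break
    print(f"no capacity for {gpu_count}x {gpu_type_id}, trying next option")
```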
13 replies
RunPod
Created by f4242 on 3/13/2024 in #⛅|pods
Keeping reverse proxy hostname between destroy/start
Hello, I'm using network storage for my pod. My use case doesn't require the container to be up 24/7. I noticed there was no stop button in the web GUI, but I was able to start/stop the container with the API, so I did. I think this may be what caused my pod to end up running without a GPU attached; this morning I found I could only start the pod without a GPU in the web GUI. I was trying to stop the container instead of destroying it because I want to keep the same container id, so my reverse HTTP proxy hostname doesn't change each time. Is there a workaround? Thanks.
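For reference, this is the stop/resume flow I'm using over the API (a sketch based on the podStop / podResume mutations in the API docs; API key and pod id are placeholders). The point is that the pod id, and with it the proxy hostname, should survive a stop, whereas destroying the pod changes both.
```python
# Sketch: stop and later resume a pod through the GraphQL API so the pod id,
# and therefore the <podId>-<port>.proxy.runpod.net hostname, stays the same.
# Based on the podStop / podResume examples in the API docs; note that a
# stopped pod is not guaranteed to get its GPU back if the host has filled up.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder
POD_ID = "YOUR_POD_ID"    # placeholder
URL = f"https://api.runpod.io/graphql?api_key={API_KEY}"

def gql(query: str) -> dict:
    resp = requests.post(URL, json={"query": query})
    resp.raise_for_status()
    return resp.json()

def stop_pod(pod_id: str) -> dict:
    return gql(f'mutation {{ podStop(input: {{podId: "{pod_id}"}}) {{ id desiredStatus }} }}')

def resume_pod(pod_id: str, gpu_count: int = 1) -> dict:
    return gql(
        f'mutation {{ podResume(input: {{podId: "{pod_id}", gpuCount: {gpu_count}}}) '
        '{ id desiredStatus } }'
    )

print(stop_pod(POD_ID))    # pod keeps its id (and volume) while stopped
# ... later, when the container is needed again ...
print(resume_pod(POD_ID))  # same id, so the proxy hostname is unchanged
```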
6 replies
RunPod
Created by f4242 on 3/11/2024 in #⛅|pods
No GPU, RO RTX4090 node
Hi, it seems like there is an issue with the RTX 4090 Romanian node. It looks like there is no GPU attached even though I'm paying the regular price (not CPU-only). Maybe unrelated, but something also prevents me from starting it with 2 GPUs. nvidia-smi returns "Failed to initialize NVML: Driver/library version mismatch" and llama.cpp says "ggml_init_cublas: no CUDA devices found, CUDA will be disabled". No issues with the RTX 4000 Romanian nodes.
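In case it's useful, this is the quick sanity check I now run right after a pod boots, so I notice this state before launching anything. Just a sketch; it assumes nvidia-smi is available inside the container.
```python
# Sketch: verify that the expected number of GPUs is actually visible before
# starting any work. Catches both the "no GPU attached" case and NVML errors
# like "Driver/library version mismatch". Assumes nvidia-smi is on PATH.
import subprocess
import sys

def gpu_ok(expected_count: int = 1) -> bool:
    try:
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
            capture_output=True, text=True, check=True,
        )
    except (OSError, subprocess.CalledProcessError) as exc:
        print(f"nvidia-smi failed: {exc}")
        return False
    gpus = [line.strip() for line in out.stdout.splitlines() if line.strip()]
    print("visible GPUs:", gpus or "none")
    return len(gpus) >= expected_count

if __name__ == "__main__":
    sys.exit(0 if gpu_ok(expected_count=1) else 1)
```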
8 replies