Pod / GPU stopped working
It's just sad to see a company with an in-demand service, one that's not cheap, not putting more effort into directly solving paying customers' issues.
And for a data centre to have issues for a day or more is crazy. It can't be a real data centre. I came from the web dev world, where issues like this were resolved in minutes, or at worst hours.
That would explain a lot, and I'd noticed it since yesterday. That is frustrating. Thanks for highlighting this. Good to know.
RunPod should be refunding some of the charges when billing is per hour and people are clearly paying for something that doesn't work.
I tried that. Each thing I do or try seems to give me another error. Ollama won't stay running. Restarting now starts without Ollama running (which it didn't do before).
I guess the pod is lost. Very annoying, as I'm now behind on a deadline, and setting this all up again is a bit crazy.
I'm trying TensorDock and Vast to see if I can get something there at a similar price (if I have to buy a new pod and set it all up again, with no support here to look into the pod, it makes no sense to stay if there's an alternative). I've spent $100s on this pod, as it's been on 24/7.
So it seems I cannot do anything now. I just tried to pull a model (the same one I pulled before, from Hugging Face), and now get this:
Error: max retries exceeded: write /workspace/blobs/sha256-a2a70f064765a8161ab76dc053b07acae14bc14056c0852550c1608724d60815-partial: input/output error
The pod is empty (nothing else installed).
Do you think I have any hope of continuing with this pod?
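For anyone hitting the same thing: an "input/output error" on a write usually points at the storage volume rather than at Ollama itself. A minimal diagnostic sketch, assuming a Linux pod with a volume mounted at /workspace (these commands only diagnose — they cannot repair a failing host disk):

```shell
# Hedged sketch: check whether /workspace is mounted and writable.
# An I/O error on write usually means the underlying volume is
# failing or unmounted, not an Ollama bug.
df -h /workspace 2>/dev/null || echo "/workspace is not a mounted volume here"

# Try a tiny write to the same path the model pull failed on:
if touch /workspace/.write-test 2>/dev/null; then
  rm -f /workspace/.write-test
  echo "write OK"
else
  echo "write FAILED"
fi
```

If the tiny write also fails, the problem is below Ollama's level and only the host operator can fix it.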
Re. the web terminal: I'd clicked Start and there was an animation, but it wasn't showing the connect option. It is now.
I've been using it with no browser restrictions or blockers (one thing I do is SaaS consultancy, so I always turn that stuff off for dashboards, as it can be problematic).
Re. the issue: with the correct install command it worked, and there's a line in the results that I've not seen before:
'NVIDIA GPU installed.'
Is it possible that somehow yesterday the GPU software/driver stopped or got corrupted (I wouldn't know how to uninstall it)?
Or is this not an unusual message?
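(Side note for later readers: that line appears to be printed by the install script when it detects a GPU, so by itself it isn't alarming. A quick hedged way to check whether the driver is still healthy — nvidia-smi ships with the NVIDIA driver, and the guard below is only so the snippet degrades gracefully on machines without one:)

```shell
# Hedged check: if the driver is healthy, nvidia-smi lists the GPU;
# if it errors out or is missing, the host driver (not Ollama) is suspect.
if command -v nvidia-smi >/dev/null 2>&1; then
  nvidia-smi --query-gpu=name,driver_version --format=csv,noheader
else
  echo "nvidia-smi not found (no NVIDIA driver on this machine)"
fi
```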
Re. support, I've not had a reply yet after two attempts (that was last week). I tried a third time just before writing here. But in fairness, although you and a couple of others have been a great help, you can see RunPod have pushed support to the community, with no real intent to do much more (it's not an unusual approach, but very odd for a paid service that amounts to $100s a month).
It's ONLY this channel and a couple of you that meant I stayed. But I actively look for alternatives every day.
Wearing a different hat of mine: RunPod are missing a huge opportunity to be top of the pile, given the plus sides of what they have.
Thanks for noticing the missing 'h'. It actually failed with the correct command too, but not with that error (I didn't copy it at the time). I'll come back on that one.
So on restart, for about 20 minutes I couldn't open the web terminal. That seems to work now.
The main issue started sometime yesterday late morning (my timezone): suddenly models got very slow (ollama run <modelname> was timing out, which I had not seen before). That's when I first restarted and reinstalled Ollama (which worked then), but it was still very slow.
It may or may not be related, but I also had this:
curl https://ollama.ai/install.sh | s
bash: s: command not found
So the Ollama update was failing. It had worked previously.
Do you think the pod is fixable, and if so, who would I ask? (I've never managed to get a response from support.)
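(On the `| s` failure specifically: the right-hand side of a pipe must be a command on PATH, and `s` normally isn't one, so the downloaded script was simply discarded. A minimal illustration with a harmless stand-in script rather than the real installer:)

```shell
# Why "| s" fails: the command after a pipe must exist on PATH.
# "sh" reads the piped text as a script; "s" is (normally) no command at all.
echo 'echo "hello from the piped script"' | sh
# With the installer, the fixed form would end in: | sh
```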
Ok, thanks.
Now the pod won't restart. So it seems something has been wrong with the pod since yesterday (when I started to have issues with Ollama, and everything was much slower than normal).
Purely on principle, do you know if we have any way to claim a refund for time paid when a pod wasn't functioning correctly?