http service [port 7860] Not Ready
I broke RunPod. I'm new to LLMs. I use the RunPod web UI/terminal and Hugging Face. I get this error regardless of which GPU I pick. I'm using the straightforward TheBloke one-click UI. It worked for weeks, then recently it stopped, so this is probably user/operator error on my part. That is the error I get when I try to start the terminal.
I see this in Logs: AttributeError: module 'gradio.layouts' has no attribute 'all'
AttributeError: module 'gradio.layouts' has no attribute 'all'
Maybe check whether you've installed the packages?
Seems like that import isn't valid
or can't be resolved by Python
Wrong package import, or the package got uninstalled. You can sanity-check the installed gradio with something like the snippet below.
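A rough sketch of that check, run inside the pod's Python (the "gradio 3.x" expectation is an assumption on my part; confirm it against the webui's own requirements.txt):

import importlib.metadata as md

try:
    installed = md.version("gradio")  # version of gradio actually installed in this pod
    print("installed gradio:", installed)
except md.PackageNotFoundError:
    print("gradio is not installed in this environment at all")
    raise SystemExit(1)

# The AttributeError usually means the installed gradio doesn't match what the
# webui code was written against (newer gradio reorganized gradio.layouts).
expected_major = "3"  # assumption -- check the requirements.txt in your template's webui checkout
if installed.split(".")[0] != expected_major:
    print("gradio major version mismatch; reinstall the version pinned in requirements.txt")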
I have never intentionally changed anything. I reboot (i9, Win11, if it matters), delete the pod, come back, and recreate the pod (rinse, repeat), and this error persists. I've tried other templates too. I might have accidentally changed something in my terminal. I did try and fail to set up SSH. But this shouldn't persist when I delete and recreate the pod, right? What do I do next?
Hmm, what template did you use then?
Is there any network storage?
Maybe the template needs network storage
SSH isn't available in all templates, btw
Yeah, it shouldn't persist unless the template isn't right or the usage isn't right
@nerdylive it's a broken TheBloke template that hasn't been updated in a very long time
Ah there thanks
https://runpod.io/console/gpu-cloud?template=00y0qvimn6&ref=2vdt3dn9
A fork of TheBloke's template, but updated
This one is tested before each new release is tagged, so it is the most stable and most up-to-date template for oobabooga. It is even more stable than TheBloke's was before he dropped off the face of the earth.
It also doesn't do any auto updates like TheBloke's template did, which constantly broke things because the oobabooga main branch is not always stable.
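If you're wondering what "no auto updates" means in practice, the idea is roughly this sketch (the tag name is a placeholder, not what the template actually pins):

import subprocess

REPO = "https://github.com/oobabooga/text-generation-webui.git"
TAG = "v1.0"  # placeholder tag -- use the release the template was actually tested against

# Clone one specific tagged release, instead of pulling main on every pod start.
subprocess.run(["git", "clone", "--depth", "1", "--branch", TAG, REPO], check=True)

# With no `git pull` at startup, the checkout stays on the tested tag, so a
# breaking change on the oobabooga main branch can't take the pod down between restarts.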