my pod starts very slowly
it takes 10 minutes for my port 5000 to go to ready, any help pls?
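(Side note: a minimal sketch for timing when the port actually becomes ready, assuming you can reach the pod from your own machine; the host and port below are placeholders for whatever address your pod exposes.)

```python
# Minimal sketch: poll a TCP port until it accepts connections, to measure
# how long the pod takes to become ready. Host and port are placeholders;
# use whatever address your pod actually exposes.
import socket
import time

host, port = "127.0.0.1", 5000
start = time.time()
while True:
    try:
        with socket.create_connection((host, port), timeout=2):
            break
    except OSError:
        time.sleep(5)
print(f"port {port} became ready after {time.time() - start:.0f}s")
```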
System logs are pretty useless, what do container logs say? Was it perhaps downloading a model?
i use koboldai/koboldai:united official
it says an error, i don't know what it is
It's probably the installation of all this stuff that causes you to have to wait for the port to come up.
Some issue updating cloudflared, you probably have to contact the author of the template about this.
i think it's installing "werkzeug" over and over
This is just a conflict in dependencies between packages and is normal behavior.
ok, do you know where i can find some help for the koboldai official template?
I also thought it might be my ports being misconfigured or my firewall, is that possible?
Pods don't usually have a firewall.
@jibay I guess the first question is what do you mean by koboldai official template?
All RunPod templates are "official" templates.
Ohh
didn't see this one, interesting
I have never used the Kobold AI templates, so don't know what they do
If it's not runpod/templatename @jibay, it isn't one by runpod. So the best people to ask are the koboldai folks who make the template, as it's a community template / made externally.
Your second best bet is to start with a pytorch template from runpod, go through the installation steps manually for whatever you want to use, see if you can get it working, and then build a Dockerfile for yourself.
He is using the RunPod one.
Is it? huh.
His first screenshot shows the same docker image
Is kobold run by runpod? interesting, i guess i don't know the difference between runpod / community then. hm.
I was thinking maybe just not actively maintained
It's community; RunPod makes some community things official, like Kobold and TheBloke
Ohhh. Interesting. Thanks, I didn't know that
Good to know. Always learning something new
Zeen also discussed possibly making some of my templates official with me a few months back
Your stable diffusion one would be awesome lol. Okok. Good to know then. This seems like a runpod issue maybe, or just something went wrong with dependencies recently.
Yeah, it's the #1 most popular community template on RunPod, with almost 20 years of runtime on RunPod and almost 20,000 docker pulls
Probably the dev needs to look at it, but I don't know who the dev is, maybe @Madiator2011 will know
Sorry, I don't know about this template
I also don't see it in our repo https://github.com/runpod/containers/tree/main/official-templates
Maybe consider deprecating it from your official templates section then, cause it feels like you'd be giving it active support. Maybe top community templates should be a section instead, or a tag.
Looks like @Henky!! is the dev:
https://discord.com/channels/912829806415085598/949023433960857610/1103819303016472586
And here is the repo:
https://github.com/0cc4m/koboldai
wow nice~
hi justin. yes, i use the runpod Kobold AI United from the official templates section. by the way, i tried to deploy the pytorch pod and it works perfectly, my port becomes "ready" instantly.
My guess is that this template has fallen out of support by the original developer / something went wrong. I looked at the github repo:
https://github.com/0cc4m/koboldai
And if we follow the git submodules:
The repository it was submoduling has been put into public archive:
https://github.com/db0/KoboldAI-Horde-Bridge/tree/7a7327804ff10182adf8cda48e97784958699a49
Not sure why this would break a docker container... usually things tend to keep working once they're built into an image
But your best bet might be to look elsewhere:
https://github.com/bangkokpadang/KoboldAI-Runpod
https://www.reddit.com/r/KoboldAI/comments/14lxcyh/a_quick_and_easy_way_to_run_koboldai_andor/
From some quick googling, it seems like this bangkokpadang guy has something along those lines you can look into, though it isn't a one-click setup.
ty for your help, justin and ashleyk
I did generate a new version yesterday, did I break something?
The reason United is a bit slower is because it's self-updating on boot
We had an old torch in there as I was having some docker build issues the past weeks, but to my knowledge it's more reasonable now that that's solved. Still expect a minute or two of update time as it checks everything.
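(For context, the update-on-boot pattern looks roughly like the sketch below: pull the latest code, then launch the server. The repo path and launch command are placeholders, not the template's actual startup script.)

```python
# Rough sketch of a self-update-on-boot entrypoint: pull the latest code,
# then start the server. REPO_DIR and LAUNCH_CMD are placeholders, not
# the real KoboldAI United startup script.
import subprocess

REPO_DIR = "/opt/koboldai"               # placeholder install location
LAUNCH_CMD = ["python3", "aiserver.py"]  # placeholder launch command

# The update check is what adds the minute or two before the port opens.
subprocess.run(["git", "-C", REPO_DIR, "pull", "--ff-only"], check=False)

# Start the server once the update attempt has finished.
subprocess.run(LAUNCH_CMD, cwd=REPO_DIR, check=False)
```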
That repo isn't the repo, that's just an abandoned build from a dev who already upstreamed his work
The real repo united uses is https://github.com/henk717/koboldai
Looking at the error I already know what that was
It was stuck on trying to start a cloudflare tunnel because yesterday cloudflare released a new version, and their CI always takes a few hours to release the new binary in the place we expect. So it should have solved itself by now.
@justin We do still actively maintain it so if people have issues feel free to forward to https://koboldai.org/discord where more people can help in a support channel. Otherwise you have to hope I see the notification in time.
Also for your info, the runpod template is supplied by our official docker image, so I maintain it rather than runpod
wow great to know! Thanks!!
If this specific cloudflare issue happens again, the user can remove --remote from the args to stop the CF attempt. Luckily they don't release binaries all that often, but when they do the issue will be back for a few hours
I'm really not good at this kind of thing, is it hard to do?
Nah, the easiest method is waiting two hours, but when you start your runpod you have the ability to customize the template.
On that screen, find --remote and remove it. I just looked and cloudflare undid the update, so it may happen again soon when they publish again.
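(Illustration only: the edit just deletes the --remote token from the template's argument string. The surrounding arguments in this sketch are made up, not the template's real command.)

```python
# Illustration only: strip the --remote flag from an argument string.
# The example arguments are placeholders, not the template's real command.
args = "python3 aiserver.py --remote --port 5000".split()
args = [a for a in args if a != "--remote"]
print(" ".join(args))  # -> python3 aiserver.py --port 5000
```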
ty for your help
Feel free to join our own discord to hang out with other Kobold users and for faster support