Local LLM for Code

Hey all, I'm planning on getting a 5090 for gaming, but I'd love it if it could also double as a tool for running LLMs locally. In this fast-changing LLM world, can anybody recommend some local models that would pair nicely with Roo Code or similar VS Code extensions? I'm primarily focused on JS/React. I plan on using LM Studio or Ollama but am unsure which models to use. FYI, the 5090 has ~30GB of usable VRAM. Thanks 🙂
