Trystan
Theo's Typesafe Cult
Created by Trystan on 3/29/2025 in #questions
Local LLM for Code
Hey all, I am planning on getting a 5090 for gaming, but I would love it if it could double as a tool for running LLMs locally. In this fast-changing LLM world, can anybody recommend some local models that would pair nicely with Roo Code or similar VS Code extensions? I'm primarily focusing on JS/React. I plan on using LM Studio or Ollama but am unsure which models to use. FYI, the 5090 has ~30GB of usable VRAM. Thanks 🙂
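For reference, both Ollama and LM Studio serve models over an OpenAI-compatible HTTP API on localhost, so before wiring anything into an editor extension you can sanity-check the local server from a small TypeScript script. This is just a rough sketch under assumptions: the base URL is Ollama's default (LM Studio defaults to a different port), and the model tag is only an example placeholder for whatever you end up pulling.

```ts
// Minimal sketch: verify a locally served model responds before pointing an
// editor extension at it. Assumes Ollama's default OpenAI-compatible endpoint
// on localhost:11434; LM Studio's local server defaults to localhost:1234/v1.
// Runnable with Node 18+ or Bun (global fetch).

const BASE_URL = "http://localhost:11434/v1"; // Ollama default; LM Studio: http://localhost:1234/v1
const MODEL = "qwen2.5-coder:32b";            // example tag only — swap in whichever model you pull

async function main() {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: MODEL,
      messages: [
        { role: "user", content: "Write a React hook that debounces a value." },
      ],
    }),
  });

  if (!res.ok) {
    throw new Error(`Local server responded ${res.status}: ${await res.text()}`);
  }

  const data = await res.json();
  console.log(data.choices[0].message.content);
}

main().catch(console.error);
```

If that round-trips, pointing Roo Code (or a similar extension) at the same base URL as an OpenAI-compatible provider should behave the same way, since it's hitting the identical local endpoint.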