manan4884
RunPod
Created by manan4884 on 2/15/2024 in #⛅|pods
Ollama API
Hello, I am trying to host LLMs on RunPod GPU Cloud using Ollama (https://ollama.com/download). I want to set it up as an endpoint so I can access it from my local laptop using Python libraries like LangChain. I'm having trouble setting up the API endpoint; has anyone worked with this before?
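Here's roughly what I'm hoping to run from my laptop once the pod is up, just to show what I mean by an endpoint. The URL is a placeholder, not a real pod address (Ollama listens on port 11434 by default):

```python
# Sketch of the client side I'd like to get working from my laptop.
# The base_url is a placeholder; I'd swap in whatever public URL the pod exposes.
from langchain_community.llms import Ollama

llm = Ollama(
    base_url="https://<pod-id>-11434.proxy.runpod.net",  # placeholder pod URL
    model="llama2",
)

print(llm.invoke("Hello from my laptop!"))
```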
23 replies