LoRA path in vLLM serverless template

I want to attach a custom LoRA adapter to the Llama-3.1-70B model. Normally with vLLM, alongside `--enable-lora` we also pass `--lora-modules name=lora_adapter_path`. But the template only exposes an option to enable LoRA, so where do I add the path to the LoRA adapter?
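For reference, the standard vLLM CLI invocation being described would look roughly like this (the adapter name and path below are placeholders, not values from the template):

```shell
# Plain vLLM (outside the serverless template): enable LoRA and
# register an adapter by name=path. "my_adapter" and the path are
# illustrative placeholders.
vllm serve meta-llama/Llama-3.1-70B-Instruct \
  --enable-lora \
  --lora-modules my_adapter=/path/to/lora_adapter
```

The registered name (`my_adapter` here) is what you would then pass as the `model` field in requests to route them through the adapter.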