Created by jd24 on 4/23/2024 in #⚡|serverless
How does the vLLM serverless worker support the OpenAI API contract?
I'm having issues getting this to work. It says `openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Incorrect API key provided: WCFGLZ0M**L96G. You can find your API key at https://platform.openai.com/account/api-keys.', 'type': 'invalid_request_error', 'param': None, 'code': 'invalid_api_key'}`. Any help would be greatly appreciated. Thank you.
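For reference, here's a minimal sketch of what I'm attempting with the OpenAI Python client, assuming the vLLM worker exposes an OpenAI-compatible route under the endpoint's `/openai/v1` path and accepts the RunPod API key in place of an OpenAI key (the endpoint ID, key, and model name below are placeholders):

```python
from openai import OpenAI

# Placeholders -- substitute your own RunPod API key, serverless endpoint ID,
# and the model the worker is serving.
RUNPOD_API_KEY = "<YOUR_RUNPOD_API_KEY>"
ENDPOINT_ID = "<YOUR_ENDPOINT_ID>"

# Point the client at the RunPod endpoint instead of api.openai.com;
# without base_url, the SDK sends the RunPod key to OpenAI and gets a 401.
client = OpenAI(
    api_key=RUNPOD_API_KEY,
    base_url=f"https://api.runpod.ai/v2/{ENDPOINT_ID}/openai/v1",
)

response = client.chat.completions.create(
    model="<MODEL_NAME>",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```

The 401 in the error above mentions platform.openai.com, which suggests the request is still going to OpenAI's servers rather than the RunPod endpoint.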