RunPod · 9mo ago
Concept

VLLM Error

2024-02-28T21:49:45Z  The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/handler.py", line 8, in <module>
    vllm_engine = vLLMEngine()
                  ^^^^^^^^^^^^
  File "/engine.py", line 37, in __init__
    self.tokenizer = Tokenizer(self.config["model"])
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/engine.py", line 13, in __init__
    self.tokenizer = AutoTokenizer.from_pretrained(model_name)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/models/auto/tokenization_auto.py", line 752, in from_pretrained
    config = AutoConfig.from_pretrained(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/models/auto/configuration_auto.py", line 1082, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
                                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/configuration_utils.py", line 644, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/configuration_utils.py", line 699, in _get_config_dict
    resolved_config_file = cached_file(
                           ^^^^^^^^^^^^
  File "/usr/local/lib/python3.11/dist-packages/transformers/utils/hub.py", line 429, in cached_file
    raise EnvironmentError(
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like mistralai/Mistral-7B-Instruct-v0.1 is not the path to a directory containing a file named config.json. Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'

Anyone else seeing this with the vLLM worker?
2 Replies
ashleyk
ashleyk9mo ago
https://huggingface.co/ is returning a 503 - it's in maintenance
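Beyond waiting out the outage, one common mitigation is to bake the model into the image and run transformers in offline mode, so a Hub 503 cannot crash the worker at startup. Here is a minimal sketch; the `load_tokenizer` helper is illustrative (not the worker's actual code), and it assumes the model files were already downloaded into the local cache, e.g. during the Docker build:

```python
import os

# Tell transformers and huggingface_hub to use only locally cached files,
# so an unreachable huggingface.co cannot break startup. Assumes the model
# was pre-downloaded into the cache (e.g. at image build time).
os.environ["TRANSFORMERS_OFFLINE"] = "1"
os.environ["HF_HUB_OFFLINE"] = "1"

def load_tokenizer(model_name: str):
    """Illustrative helper: load a tokenizer from the local cache only."""
    from transformers import AutoTokenizer  # imported lazily for this sketch
    # local_files_only=True skips the network call entirely; if the files
    # are missing from the cache it raises immediately instead of depending
    # on the Hub being up.
    return AutoTokenizer.from_pretrained(model_name, local_files_only=True)
```

With this in place, a Hub outage only matters at build time, not when a serverless worker cold-starts.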
Concept (OP) · 9mo ago
thank you :)