Deep Learning: Local Deployment of the Large Model ChatGLM3-6B [Large Models] [Error Report]

Posted: 2025-05-06 08:15:43
The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/flyvideo/mySata/zzx/ChatGLM3/", line 2, in <module>
    tokenizer = AutoTokenizer.from_pretrained("./model/chatglm3-6b", trust_remote_code=True)
  File "/home/flyvideo/anaconda3/envs/toolbench/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 719, in from_pretrained
    tokenizer_class = get_class_from_dynamic_module(class_ref, pretrained_model_name_or_path, **kwargs)
  File "/home/flyvideo/anaconda3/envs/toolbench/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 485, in get_class_from_dynamic_module
    final_module = get_cached_module_file(
  File "/home/flyvideo/anaconda3/envs/toolbench/lib/python3.10/site-packages/transformers/dynamic_module_utils.py", line 292, in get_cached_module_file
    resolved_module_file = cached_file(
  File "/home/flyvideo/anaconda3/envs/toolbench/lib/python3.10/site-packages/transformers/utils/", line 469, in cached_file
    raise EnvironmentError(
OSError: We couldn't connect to '' to load this file, couldn't find it in the cached files and it looks like THUDM/chatglm3-6b is not the path to a directory containing a file named tokenization_chatglm.py. Checkout your internet connection or see how to run the library in offline mode at '/docs/transformers/installation#offline-mode'
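
The traceback shows what goes wrong: because trust_remote_code=True is used and transformers cannot find the custom tokenizer code (tokenization_chatglm.py) in the local directory or in its cache, it falls back to fetching it from the THUDM/chatglm3-6b repository on the Hub, and that download fails without a network connection. Below is a minimal sketch of one way to load everything offline, assuming the complete ChatGLM3-6B snapshot, including the custom *.py files, has already been copied into ./model/chatglm3-6b (the directory name is taken from the traceback; the exact contents are an assumption):

import os
from transformers import AutoModel, AutoTokenizer

# Assumption: the full ChatGLM3-6B snapshot, including the custom code files
# (tokenization_chatglm.py, modeling_chatglm.py, configuration_chatglm.py),
# has been downloaded into this local directory beforehand.
MODEL_DIR = "./model/chatglm3-6b"

# Keep transformers from trying to reach the Hub at all; resolve everything locally.
os.environ["HF_HUB_OFFLINE"] = "1"

tokenizer = AutoTokenizer.from_pretrained(
    MODEL_DIR,
    trust_remote_code=True,   # ChatGLM3 ships its tokenizer as custom repo code
    local_files_only=True,    # fail fast instead of falling back to THUDM/chatglm3-6b
)
model = AutoModel.from_pretrained(
    MODEL_DIR,
    trust_remote_code=True,
    local_files_only=True,
).half().cuda().eval()

If the custom .py files are missing from the local folder, copying the full model repository into it (for example by cloning it from the Hugging Face Hub or a ModelScope mirror on a machine with network access) is what makes this OSError go away; local_files_only / HF_HUB_OFFLINE only stop the silent fallback to the Hub.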