Is there an existing issue for this?
- I have searched the existing issues
Current Behavior
Loading the model via the MPS backend:

```python
tokenizer = AutoTokenizer.from_pretrained("/本地路径示例/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("/本地路径示例/chatglm-6b", trust_remote_code=True).half().to('mps')
```

This emits the warning: `UserWarning: MPS: no support for int64 min/max ops, casting it to int32`
After that the process hangs indefinitely with no output.
Expected Behavior
No response
Steps To Reproduce
1. Load the tokenizer and model from the local checkpoint with `trust_remote_code=True`.
2. Move the model to the MPS device with `.half().to('mps')`.
3. Observe the int64 warning, after which the process hangs with no output.
Environment
- OS: macOS Ventura
- Python: 3.10
- Transformers:
- PyTorch:
- CUDA Support (`python -c "import torch; print(torch.cuda.is_available())"`): False