Testing GPT with PaddleNLP on Windows (CPU)

Author: 猴君

PaddleNLP

Install Python 3.11

conda create -n python311 python=3.11

Activate the environment

conda activate python311

Install PaddlePaddle and PaddleNLP

conda install paddlepaddle==3.0.0b0 -c paddle

pip install paddlenlp==3.0.0b0 -U -i https://pypi.tuna.tsinghua.edu.cn/simple

On Windows, loading a model fails with:

AttributeError: module 'mmap' has no attribute 'MAP_PRIVATE'

Fix: open

E:\Anaconda3\envs\python311\Lib\site-packages\paddlenlp\utils\safetensors.py

and change line 280 from

self.file_mmap = mmap.mmap(self.file.fileno(), 0, access=mmap.MAP_PRIVATE)

to

self.file_mmap = mmap.mmap(self.file.fileno(), 0, access=mmap.ACCESS_READ)
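The reason this one-line change works: `mmap.MAP_PRIVATE` is only defined on POSIX builds of Python, while `mmap.ACCESS_READ` is available on both Windows and Unix. A minimal standalone sketch of cross-platform read-only mapping (using a throwaway temp file, unrelated to PaddleNLP itself):

```python
import mmap
import os
import tempfile

# Write a small file to map. mmap of length 0 maps the whole file,
# which requires the file to be non-empty.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"hello paddlenlp")
    path = f.name

# ACCESS_READ is portable; MAP_PRIVATE only exists on POSIX,
# hence the AttributeError on Windows.
with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    data = mm[:]
    mm.close()

os.remove(path)
print(data)  # b'hello paddlenlp'
```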

Next error: RuntimeError: (NotFound) The kernel with key (CPU, Undefined(AnyLayout), float16) of kernel multiply is not registered. Selected wrong DataType float16. Paddle support following DataTypes: complex64, bool, bfloat16, complex128, float32, int32, float64, int64

Cause:

On CPU, models only support dtype float32 or float64;

On GPU (pre-Ampere architectures), models support dtype float16, float32, or float64;

On GPU (Ampere and later architectures), models support dtype bfloat16, float16, float32, or float64.
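The rules above can be captured in a small helper that picks a safe `dtype` string to pass to `from_pretrained`. `pick_dtype` is a hypothetical name for illustration, not a PaddleNLP API:

```python
def pick_dtype(device: str, ampere_or_newer: bool = False) -> str:
    """Return the lowest-precision dtype string the device supports,
    per the compatibility rules above (hypothetical helper)."""
    if device == "cpu":
        return "float32"   # CPU kernels: float32 / float64 only
    if ampere_or_newer:
        return "bfloat16"  # Ampere and later GPUs add bfloat16
    return "float16"       # pre-Ampere GPUs support float16

print(pick_dtype("cpu"))        # float32  -> what this article must use
print(pick_dtype("gpu"))        # float16
print(pick_dtype("gpu", True))  # bfloat16
```

On this Windows CPU setup, the fix is simply to pass dtype="float32" when loading the model, as the test code below does.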

Test code:

import os

os.environ["HF_ENDPOINT"] = "https://hf-mirror.com"
os.environ["TF_ENABLE_ONEDNN_OPTS"] = "0"

from modelscope import snapshot_download
from paddlenlp.transformers import AutoTokenizer, AutoModelForCausalLM

# Download the model weights from ModelScope
model_dir = snapshot_download("Qwen/Qwen2-0.5B")

tokenizer = AutoTokenizer.from_pretrained("Qwen/Qwen2-0.5B", trust_remote_code=True)
# dtype="float32" is required on CPU (see above)
model = AutoModelForCausalLM.from_pretrained("Qwen/Qwen2-0.5B", dtype="float32")

input_features = tokenizer("你好!请自我介绍一下。", return_tensors="pd")
outputs = model.generate(**input_features, max_length=128)
text = tokenizer.batch_decode(outputs[0])
print(text)
# ['我是一个AI语言模型,我可以回答各种问题,包括但不限于:天气、新闻、历史、文化、科学、教育、娱乐等。请问您有什么需要了解的吗?']
