My_model_compare

Anonymous user, July 31, 2024
Repository: https://modelscope.cn/models/neala668/My_model_compare

Project details

This repository contains a course assignment comparing large language models. Each model below is loaded via ModelScope and given the same kind of tricky Chinese prompts to compare how they handle ambiguity.

Zhipu (ChatGLM3)

Reference: https://github.com/THUDM/ChatGLM3

from modelscope import AutoTokenizer, AutoModel, snapshot_download

# Download the ChatGLM3-6B-32K weights and load them in fp16 on the GPU.
model_dir = snapshot_download("ZhipuAI/chatglm3-6b-32k", revision="v1.0.0")
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModel.from_pretrained(model_dir, trust_remote_code=True).half().cuda()
model = model.eval()

# model.chat returns the reply plus the updated conversation history.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)

# An ambiguity test: "能穿多少穿多少" means "wear as much as you can" in winter
# but "wear as little as you can" in summer.
response, history = model.chat(tokenizer, "冬天:能穿多少穿多少 2、夏天:能穿多少穿多少", history=history)
print(response)

Tongyi Qianwen (Qwen)

Reference: https://github.com/QwenLM/Qwen

from modelscope import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("qwen/Qwen-7B-Chat", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("qwen/Qwen-7B-Chat", device_map="auto", trust_remote_code=True).eval()

# First turn: history=None starts a new conversation.
response, history = model.chat(tokenizer, "你好", history=None)
print(response)

# A segmentation test: "明明明明明白白白喜欢他" can be parsed several ways
# depending on which characters are read as the names 明明 and 白白.
response, history = model.chat(tokenizer, "明明明明明白白白喜欢他,可她就是不说。 这句话里,明明和白白谁喜欢谁?", history=history)
print(response)

Baichuan

Reference: https://modelscope.cn/models/baichuan-inc/baichuan-7B/summary

import torch
from modelscope import snapshot_download, AutoModelForCausalLM, AutoTokenizer, GenerationConfig

model_dir = snapshot_download("baichuan-inc/Baichuan2-7B-Chat", revision='v1.0.5')
# device_map/torch_dtype apply to the model, not the tokenizer.
tokenizer = AutoTokenizer.from_pretrained(model_dir, use_fast=False, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_dir, device_map="auto",
                              trust_remote_code=True, torch_dtype=torch.float16)
model.generation_config = GenerationConfig.from_pretrained(model_dir)

# Baichuan's chat API takes an OpenAI-style message list instead of a history tuple.
messages = []
messages.append({"role": "user", "content": "你好"})
response = model.chat(tokenizer, messages)
print(response)

# A nesting test: "他知道我知道你知道他不知道吗?" stacks "know" clauses and asks
# who, in the end, does not know.
messages.append({'role': 'assistant', 'content': response})
messages.append({"role": "user", "content": "他知道我知道你知道他不知道吗? 这句话里,到底谁不知道"})
response = model.chat(tokenizer, messages)
print(response)
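The three chat APIs above differ slightly: ChatGLM3 and Qwen return a (response, history) pair, while Baichuan2 takes a growing message list. For a side-by-side comparison it can help to hide those differences behind one adapter type. A minimal sketch, not part of this repository; the `compare` function and the stub backends are hypothetical, and a real run would wrap each model's `model.chat` call:

```python
from typing import Callable, Dict, List

# A chat backend is any function mapping a prompt string to a response string.
ChatFn = Callable[[str], str]

def compare(prompts: List[str], backends: Dict[str, ChatFn]) -> Dict[str, List[str]]:
    """Run every prompt through every backend and collect the answers per model."""
    results: Dict[str, List[str]] = {}
    for name, chat in backends.items():
        results[name] = [chat(p) for p in prompts]
    return results

if __name__ == "__main__":
    # Stub backends stand in for the real model.chat wrappers.
    stubs = {
        "chatglm3": lambda p: "[chatglm3] " + p,
        "qwen": lambda p: "[qwen] " + p,
    }
    table = compare(["你好"], stubs)
    for name, answers in table.items():
        print(name, answers[0])
```

With this shape, each model section above only needs a small closure (for example, one that threads `history` through successive `model.chat` calls) to participate in the same comparison loop.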

Download via Git

git clone https://www.modelscope.cn/neala668/My_model_compare.git