Llama3-Chinese-8B

Llama3-Chinese-8B is a Chinese chat model based on Llama3-8B, developed jointly by the Llama Chinese community and AtomEcho. Updated model weights will be released on an ongoing basis; the training process is documented at https://llama.family. For deployment, training, and fine-tuning instructions, see the Llama Chinese community GitHub repository: https://github.com/LlamaFamily/Llama-Chinese

Online Demo

Try the model in your browser at https://llama.family/chat/#/

Download the Model

Clone with HTTP:

git clone https://www.modelscope.cn/FlagAlpha/Llama3-Chinese-8B-Instruct.git

How to Use
import transformers
import torch

model_id = "./Llama3-Chinese-8B-Instruct"

# Build a text-generation pipeline on GPU with half-precision weights
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.float16},
    device="cuda",
)

# Chat history: an (empty) system prompt followed by the user turn
messages = [{"role": "system", "content": ""}]
messages.append(
    {"role": "user", "content": "介绍一下机器学习"}  # "Give an introduction to machine learning"
)

# Render the chat history into the Llama 3 prompt format
prompt = pipeline.tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)

# Stop on either the EOS token or Llama 3's end-of-turn token
terminators = [
    pipeline.tokenizer.eos_token_id,
    pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>")
]

outputs = pipeline(
    prompt,
    max_new_tokens=512,
    eos_token_id=terminators,
    do_sample=True,
    temperature=0.6,
    top_p=0.9
)

# The pipeline returns prompt + completion; slice off the prompt
content = outputs[0]["generated_text"][len(prompt):]
print(content)
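To make the `apply_chat_template` and terminator steps above less opaque, here is a plain-Python sketch of the prompt string the tokenizer builds for Llama 3 Instruct models. The authoritative template ships inside the model's tokenizer configuration; `build_llama3_prompt` is a hypothetical helper written only to make the published Llama 3 chat format visible, not part of the transformers API.

```python
def build_llama3_prompt(messages, add_generation_prompt=True):
    """Render a messages list into the Llama 3 Instruct chat format."""
    prompt = "<|begin_of_text|>"
    for m in messages:
        # Each turn: a role header, a blank line, the content, then end-of-turn
        prompt += (
            f"<|start_header_id|>{m['role']}<|end_header_id|>\n\n"
            f"{m['content']}<|eot_id|>"
        )
    if add_generation_prompt:
        # Open an assistant turn so the model generates the reply next
        prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

messages = [
    {"role": "system", "content": ""},
    {"role": "user", "content": "介绍一下机器学习"},
]
print(build_llama3_prompt(messages))
```

This also shows why `<|eot_id|>` appears in the terminator list: the model signals the end of its own turn with that token, so generation should stop there as well as at EOS.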