The following is the HF transformers implementation of the EagleX 7B 1.7T model. For the full model weights on their own, to use with other RWKV libraries, refer to here.

This is not an instruct tuned model! (soon…)

See the following for the full details on this experimental model: https://substack.recursal.ai/p/eaglex-17t-soaring-past-llama-7b

Huggingface EagleX 1.7T Model - via HF Transformers Library
Running on GPU via HF transformers
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

def generate_prompt(instruction, input=""):
    instruction = instruction.strip().replace('\r\n', '\n').replace('\n\n', '\n')
    input = input.strip().replace('\r\n', '\n').replace('\n\n', '\n')
    if input:
        return f"""Instruction: {instruction}
Input: {input}
Response:"""
    else:
        return f"""User: hi
Assistant: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.
User: {instruction}
Assistant:"""

model = AutoModelForCausalLM.from_pretrained("recursal/EagleX_1-7T_HF", trust_remote_code=True, torch_dtype=torch.float16).to(0)
tokenizer = AutoTokenizer.from_pretrained("recursal/EagleX_1-7T_HF", trust_remote_code=True)

text = "Tell me a fun fact"
prompt = generate_prompt(text)

inputs = tokenizer(prompt, return_tensors="pt").to(0)
output = model.generate(inputs["input_ids"], max_new_tokens=128, do_sample=True, temperature=1.0, top_p=0.3, top_k=0)
print(tokenizer.decode(output[0].tolist(), skip_special_tokens=True))
User: hi
Assistant: Hi. I am your assistant and I will provide expert full response in full details. Please feel free to ask any question and I will always answer it.
User: Tell me a fun fact
Assistant: Did you know that the human brain has 100 billion neurons?
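As the sample output shows, decoding returns the full prompt followed by the model's continuation, and the model may start a new "User:" turn on its own. A minimal sketch of slicing out just the assistant reply (the helper name trim_response is illustrative, not part of the repo, and the sample text below is made up):

```python
def trim_response(full_text: str, prompt: str) -> str:
    # Drop the echoed prompt, then cut at the next "User:" turn in case
    # the model keeps generating a new round on its own.
    continuation = full_text[len(prompt):]
    reply = continuation.split("\nUser:")[0]
    return reply.strip()

# Made-up decoded output, for illustration only:
prompt = "User: Tell me a fun fact\nAssistant:"
full = prompt + " Did you know that octopuses have three hearts?\nUser: wow"
print(trim_response(full, prompt))
```

This keeps the prompting code above unchanged; only the decoded string is post-processed before display.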