RankingGPT is a text ranker based on large language models with significant in-domain and out-of-domain effectiveness.
We provide RankingGPT in different sizes and types, including bloom-560m, bloom-1b1, bloom-3b, bloom-7b, llama2-7b, baichuan2-7b and qwen-7b; this card describes RankingGPT-bloom-560m. For more details, please refer to our paper and GitHub. If you find our paper or models helpful, please consider citing them as in the Citation section below.
Usage
import torch
from modelscope import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained('zyznull/RankingGPT-bloom-560m')
model = AutoModelForCausalLM.from_pretrained('zyznull/RankingGPT-bloom-560m').eval()

query = 'when should a baby walk'
document = 'Most babies start to walk around 13 months, but your baby may start walking as early as 9 or 10 months or as late as 15 or 16 months.'
context = f'Document: {document} Query:'

# Tokenize the document context and the query continuation separately.
context_enc = tokenizer.encode(context, add_special_tokens=False)
continuation_enc = tokenizer.encode(query, add_special_tokens=False)
# Feed everything except the last query token; each position predicts the next token.
model_input = torch.tensor(context_enc + continuation_enc[:-1])
continuation_len = len(continuation_enc)
input_len, = model_input.shape

with torch.no_grad():
    # Log-probabilities over the vocabulary at every position.
    logprobs = torch.nn.functional.log_softmax(model(model_input.unsqueeze(dim=0))[0], dim=-1)[0]

# Keep only the positions that predict the query tokens.
logprobs = logprobs[input_len - continuation_len:]
# Gather the log-probability assigned to each actual query token.
logprobs = torch.gather(logprobs, 1, torch.tensor(continuation_enc).unsqueeze(-1)).squeeze(-1)
# Relevance score: mean log-probability of the query given the document.
score = torch.sum(logprobs) / logprobs.shape[0]

print(f"Document: {document[:20] + '...'} Score: {score}")
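To rank a candidate set, the per-document score above is computed once per document and the list is sorted in descending order (higher mean log-probability means the query is more likely given that document). A minimal sketch of that ranking step, where `rank_documents` is a hypothetical helper and the stubbed scores merely stand in for real model outputs:

```python
def rank_documents(query, docs, score_fn):
    """Score each document for the query and sort descending (most relevant first)."""
    scored = [(doc, score_fn(query, doc)) for doc in docs]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Stub scores standing in for model-computed mean log-probabilities (assumption).
fake_scores = {'doc A': -1.2, 'doc B': -0.4, 'doc C': -2.7}
ranking = rank_documents('q', list(fake_scores), lambda q, d: fake_scores[d])
print([doc for doc, score in ranking])  # → ['doc B', 'doc A', 'doc C']
```

In practice `score_fn` would wrap the snippet above, re-running the model with each candidate document in the context.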
Citatio
@misc{zhang2023rankinggpt,
      title={RankingGPT: Empowering Large Language Models in Text Ranking with Progressive Enhancement},
      author={Longhui Zhang and Yanzhao Zhang and Dingkun Long and Pengjun Xie and Meishan Zhang and Min Zhang},
      year={2023},
      eprint={2311.16720},
      archivePrefix={arXiv},
      primaryClass={cs.IR}
}