Randeng-BART-759M-Chinese-BertTokenizer


Technical Information

Official website
https://github.com/IDEA-CCNL/Fengshenbang-LM
Open-source repository
https://modelscope.cn/models/Fengshenbang/Randeng-BART-759M-Chinese-BertTokenizer
License
Apache License 2.0

Details

Randeng-BART-759M-Chinese-BertTokenizer

Brief Introduction

A large-scale Chinese BART that uses the BERT tokenizer and is good at natural language transformation (NLT) tasks.

Model Taxonomy

Demand: General
Task: Natural Language Transformation (NLT)
Series: Randeng
Model: BART
Parameter: 759M
Extra: Chinese-BERTTokenizer

Model Information

To obtain a large-scale Chinese BART (around twice the size of BART-large), we pre-trained it on WuDao Corpora (180 GB version). Specifically, we used the fengshen framework in the pre-training phase, which took about 7 days on 8 A100 GPUs. Note that we employ the BERT tokenizer because it usually performs better than other tokenizers on Chinese tasks. We have also released our pre-training code: pretrain_randeng_bart.
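The practical effect of this tokenizer choice is that Chinese input is segmented into BERT-style WordPiece tokens (effectively one token per character for Chinese) rather than BART's default byte-level BPE pieces. A minimal sketch to inspect this, assuming the transformers library and the model ID from this card (the example sentence is ours, for illustration only):

from transformers import AutoTokenizer

# Load the BERT-style tokenizer shipped with the checkpoint;
# use_fast=False picks the slow BertTokenizer used during pre-training.
tokenizer = AutoTokenizer.from_pretrained(
    'IDEA-CCNL/Randeng-BART-759M-Chinese-BertTokenizer', use_fast=False
)

# Chinese text comes back as individual characters (BERT WordPiece),
# not the byte-level BPE pieces of a stock BART tokenizer.
print(tokenizer.tokenize('桂林是著名的旅游城市。'))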

Usage

from transformers import BartForConditionalGeneration, AutoTokenizer, Text2TextGenerationPipeline
import torch

# use_fast=False selects the slow BertTokenizer the model was pre-trained with.
tokenizer = AutoTokenizer.from_pretrained('IDEA-CCNL/Randeng-BART-759M-Chinese-BertTokenizer', use_fast=False)
model = BartForConditionalGeneration.from_pretrained('IDEA-CCNL/Randeng-BART-759M-Chinese-BertTokenizer')
text = '桂林是著名的[MASK],它有很多[MASK]。'
text2text_generator = Text2TextGenerationPipeline(model, tokenizer)
# Greedy decoding fills in the [MASK] spans.
print(text2text_generator(text, max_length=50, do_sample=False))
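The pipeline fills each [MASK] span by sequence-to-sequence generation. As a hedged alternative sketch, not part of the original card, you can drive generation directly (reusing tokenizer, model, and text from the snippet above; num_beams=4 is our illustrative choice):

# Same generation without the pipeline wrapper.
inputs = tokenizer(text, return_tensors='pt')
outputs = model.generate(**inputs, max_length=50, num_beams=4, do_sample=False)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))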

Citation

If you use this resource in your work, please cite our paper:

@article{fengshenbang,
  author    = {Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen and Ruyi Gan and Jiaxing Zhang},
  title     = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence},
  journal   = {CoRR},
  volume    = {abs/2209.02970},
  year      = {2022}
}

You can also cite our website:

@misc{Fengshenbang-LM,
  title={Fengshenbang-LM},
  author={IDEA-CCNL},
  year={2021},
  howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}},
}
