chinese-pert-large-mrc

Categories: ai, bert, pytorch
Source: https://modelscope.cn/models/dienstag/chinese-pert-large-mrc
License: apache-2.0

Model details

A Chinese MRC model built on Chinese PERT-large

Please use BertForQuestionAnswering to load this model!
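The loading instruction above can be sketched with Hugging Face `transformers`. This is a minimal illustration, not part of the official card: the hub id `hfl/chinese-pert-large-mrc` is an assumption (substitute your local path or the ModelScope id above), and `best_span` is a hypothetical helper showing one simple way to decode the QA head's start/end logits.

```python
def best_span(start_logits, end_logits, max_len=30):
    """Pick the (start, end) token pair maximizing start + end score,
    with end >= start and span length capped at max_len."""
    best, best_score = (0, 0), float("-inf")
    for s, s_logit in enumerate(start_logits):
        for e in range(s, min(s + max_len, len(end_logits))):
            score = s_logit + end_logits[e]
            if score > best_score:
                best_score, best = score, (s, e)
    return best

def answer(question, context, model_name="hfl/chinese-pert-large-mrc"):
    """Load the MRC model with BertForQuestionAnswering and extract an answer.
    Heavy dependencies are imported lazily so best_span stays usable alone."""
    import torch
    from transformers import BertForQuestionAnswering, BertTokenizerFast

    tokenizer = BertTokenizerFast.from_pretrained(model_name)
    model = BertForQuestionAnswering.from_pretrained(model_name)
    inputs = tokenizer(question, context, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs)
    s, e = best_span(out.start_logits[0].tolist(), out.end_logits[0].tolist())
    return tokenizer.decode(inputs["input_ids"][0][s : e + 1])
```

Decoding by maximizing the sum of start and end logits is the standard extractive-QA heuristic; production code usually also masks out spans that fall in the question segment.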

This is a Chinese machine reading comprehension (MRC) model built on PERT-large and fine-tuned on a mixture of Chinese MRC datasets.

PERT is a pre-trained model based on the permuted language model (PerLM) objective, which learns text semantics in a self-supervised manner without introducing [MASK] tokens. It yields competitive results in tasks such as reading comprehension and sequence labeling.
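As a rough illustration of the permuted-LM idea (a toy sketch, not the exact PerLM formulation from the paper): a fraction of token positions is shuffled in place, and the self-supervised target is to recover where each moved token originally came from, with no [MASK] token ever inserted into the input.

```python
import random

def perlm_permute(tokens, ratio=0.15, seed=0):
    """Toy PerLM-style corruption: shuffle a subset of token positions.
    Returns the permuted sequence and a dict mapping each corrupted
    position to the original position of the token now placed there."""
    rng = random.Random(seed)
    k = max(2, round(len(tokens) * ratio))
    positions = rng.sample(range(len(tokens)), k)  # positions to permute
    shuffled = positions[:]
    rng.shuffle(shuffled)
    permuted = list(tokens)
    targets = {}
    for dst, src in zip(positions, shuffled):
        permuted[dst] = tokens[src]  # token moved from src to dst
        targets[dst] = src           # prediction target at position dst
    return permuted, targets
```

Because the corruption is a permutation rather than a replacement with [MASK], the pre-training input distribution stays closer to real text, which is the motivation the PERT authors give for PerLM.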

Results on Chinese MRC datasets (EM/F1):

(We report the checkpoint that has the best AVG score)

| Model | CMRC 2018 Dev | DRCD Dev | SQuAD-Zen Dev (Answerable) | AVG |
| --- | --- | --- | --- | --- |
| PERT-large | 73.5/90.8 | 91.2/95.7 | 63.0/79.3 | 75.9/88.6 |

Please visit our GitHub repo for more information: https://github.com/ymcui/PERT

You may also be interested in:

Chinese Minority Languages CINO: https://github.com/ymcui/Chinese-Minority-PLM
Chinese MacBERT: https://github.com/ymcui/MacBERT
Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology
