Machine Reading Comprehension Model Fine-tuned for NER

Technical Information

Open-source URL
https://modelscope.cn/models/iic/NER-PMR-Large
License
Apache License 2.0

Details

NER-PMR-large

NER-PMR-large is initialized with PMR-large and further fine-tuned on 4 NER training datasets, namely CoNLL, WNUT17, ACE2004, and ACE2005.

The model performance on the test sets is:

| Model | CoNLL | WNUT17 | ACE2004 | ACE2005 |
|-------|-------|--------|---------|---------|
| RoBERTa-large (single-task model) | 92.8 | 57.1 | 86.3 | 87.0 |
| PMR-large (single-task model) | 93.6 | 60.8 | 87.5 | 87.4 |
| NER-PMR-large (multi-task model) | 92.9 | 54.7 | 87.8 | 88.4 |

Note that the RoBERTa-large and PMR-large results come from single-task fine-tuning, while NER-PMR-large is a multi-task fine-tuned model. As it is fine-tuned on multiple datasets, we believe that NER-PMR-large generalizes better to other NER tasks than PMR-large and RoBERTa-large.
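
PMR casts NER as an extractive machine reading comprehension task: each entity type is phrased as a query, the sentence serves as the context, and the model extracts the matching spans. The sketch below illustrates that data transformation; the `LABEL_QUERIES` templates and the `ner_to_mrc` helper are hypothetical illustrations, not the repo's actual preprocessing code.

```python
# Minimal sketch of MRC-style NER example construction (illustrative only).
from typing import Dict, List, Tuple

# Hypothetical entity-type queries; the real templates live in the PMR repo.
LABEL_QUERIES: Dict[str, str] = {
    "PER": "person. Find all person names in the text.",
    "LOC": "location. Find all locations in the text.",
    "ORG": "organization. Find all organizations in the text.",
}

def ner_to_mrc(tokens: List[str], spans: List[Tuple[int, int, str]]) -> List[dict]:
    """Turn one tagged sentence into one MRC example per entity type.

    spans: (start_token, end_token, label) triples with inclusive ends.
    """
    context = " ".join(tokens)
    examples = []
    for label, query in LABEL_QUERIES.items():
        answers = [(s, e) for s, e, lab in spans if lab == label]
        examples.append({"query": query, "context": context, "answers": answers})
    return examples

# A CoNLL-style sentence: "Barack Obama" is PER (tokens 0-1), "Berlin" is LOC (token 3).
for ex in ner_to_mrc(["Barack", "Obama", "visited", "Berlin"],
                     [(0, 1, "PER"), (3, 3, "LOC")]):
    print(ex["query"], "->", ex["answers"])
```

Because every dataset's label set is mapped to queries in the same uniform way, a single model can be fine-tuned on all four datasets at once, which is what "multi-task" means here.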

How to use

You can try the code from this repo for both training and inference.
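
As a rough illustration of MRC-style inference, here is a sketch that assumes the checkpoint can be loaded through Hugging Face's standard extractive-QA interface; the actual PMR span-extraction head may differ, so defer to the repo's scripts for real use. The model path and query wording below are placeholders.

```python
# Illustrative inference sketch (assumes a standard start/end-logits QA head).
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

MODEL_PATH = "path/to/NER-PMR-Large"  # e.g. a local download from ModelScope

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForQuestionAnswering.from_pretrained(MODEL_PATH)
model.eval()

# Entity type as the query, sentence as the context (MRC-style NER).
query = "person. Find all person names in the text."  # hypothetical template
context = "Barack Obama was born in Hawaii."

inputs = tokenizer(query, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Decode the highest-scoring span as the extracted entity.
start = outputs.start_logits.argmax(dim=-1).item()
end = outputs.end_logits.argmax(dim=-1).item()
print(tokenizer.decode(inputs["input_ids"][0][start : end + 1]).strip())
```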

BibTeX entry and citation info

@article{xu2022clozing,
  title={From Clozing to Comprehending: Retrofitting Pre-trained Language Model to Pre-trained Machine Reader},
  author={Xu, Weiwen and Li, Xin and Zhang, Wenxuan and Zhou, Meng and Bing, Lidong and Lam, Wai and Si, Luo},
  journal={arXiv preprint arXiv:2212.04755},
  year={2022}
}
