BERT Classifier

Anonymous user · July 31, 2024
Categories: ai, bert
Open-source repository: https://modelscope.cn/models/mayor1830/bert_classifier

Project Details

Bert_classifier is a model for question classification, intended to be used with certain versions of Fourth Dimension.
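A minimal inference sketch, assuming the checkpoint is a standard transformers-style sequence-classification model; the local path ./bert_classifier and the example question are placeholders, and the actual label set ships in the checkpoint's own config:

```python
# Sketch only: assumes a standard sequence-classification checkpoint.
# "./bert_classifier" is a hypothetical local path to the downloaded model;
# real label names come from the checkpoint's config (id2label).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("./bert_classifier")
model = AutoModelForSequenceClassification.from_pretrained("./bert_classifier")

inputs = tokenizer("这个产品怎么退货?", return_tensors="pt")  # arbitrary example question
with torch.no_grad():
    logits = model(**inputs).logits

pred_id = logits.argmax(dim=-1).item()
print(model.config.id2label[pred_id])  # predicted question category
```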

Below is the readme.md of the source model, bert-base-chinese:

# Bert-base-chinese

## Table of Contents
- Model Details
- Uses
- Risks, Limitations and Biases
- Training
- Evaluation
- How to Get Started With the Model

## Model Details

### Model Description
This model has been pre-trained on Chinese text; training and random input masking have been applied independently to word pieces (as in the original BERT paper).

- Developed by: HuggingFace team
- Model Type: Fill-Mask
- Language(s): Chinese
- License: [More Information Needed]
- Parent Model: See the BERT base uncased model for more information about the BERT base model.

### Model Sources
- Paper: BERT

## Uses

### Direct Use
This model can be used for masked language modeling.

## Risks, Limitations and Biases

CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.

Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).

## Training

### Training Procedure
- type_vocab_size: 2
- vocab_size: 21128
- num_hidden_layers: 12

### Training Data
[More Information Needed]
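As a quick sanity check, the hyperparameters listed under Training Procedure can be read back from the published bert-base-chinese config; a sketch using the standard transformers AutoConfig API:

```python
# Confirm the architecture hyperparameters listed above by reading
# them from the published bert-base-chinese config.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("bert-base-chinese")
print(config.type_vocab_size)    # 2
print(config.vocab_size)         # 21128
print(config.num_hidden_layers)  # 12
```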

## Evaluation

### Results
[More Information Needed]

## How to Get Started With the Model

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForMaskedLM.from_pretrained("bert-base-chinese")
```
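Since the card lists masked language modeling as the direct use, here is a short illustrative example with the fill-mask pipeline; the input sentence is arbitrary:

```python
# Illustrative fill-mask usage; the input sentence is an arbitrary example.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="bert-base-chinese")
for candidate in fill_mask("北京是中国的[MASK]都。"):
    print(candidate["token_str"], candidate["score"])
```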
