Categories: ai, bert, pytorch
Open-source address: https://modelscope.cn/models/dienstag/rbt4-h312
License: apache-2.0

Model details

Example code

from modelscope.pipelines import pipeline
from modelscope.utils.constant import Tasks


# Build a fill-mask pipeline backed by the rbt4-h312 checkpoint.
pipeline_ins = pipeline(
    Tasks.fill_mask,
    model='dienstag/rbt4-h312',
    model_revision='v1.0.0'
)

# Predict the token at the [MASK] position.
print(pipeline_ins('生活的真谛是[MASK]。'))

Please use BERT-related functions to load this model.
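As a minimal sketch of what that looks like with Hugging Face Transformers: the hub id 'hfl/rbt4-h312' below is an assumption (the upstream HFL release of this checkpoint), so point the path at whichever copy of the weights you actually use.

import torch
from transformers import BertTokenizer, BertForMaskedLM

# Assumed hub id; substitute a local path or your own mirror if needed.
tokenizer = BertTokenizer.from_pretrained('hfl/rbt4-h312')
model = BertForMaskedLM.from_pretrained('hfl/rbt4-h312')
model.eval()

inputs = tokenizer('生活的真谛是[MASK]。', return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring token there.
mask_pos = (inputs['input_ids'][0] == tokenizer.mask_token_id).nonzero()[0].item()
top_id = int(logits[0, mask_pos].argmax(-1))
print(tokenizer.convert_ids_to_tokens(top_id))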

MiniRBT: A Small Chinese Pre-trained Model

To further promote research and development in Chinese information processing, we present MiniRBT, a small Chinese pre-trained model built with our self-developed knowledge distillation toolkit TextBrewer, combining Whole Word Masking with knowledge distillation.
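The actual training recipe lives in the TextBrewer and MiniRBT repositories linked below; purely as an illustration of the knowledge distillation objective mentioned above, here is a generic sketch of a temperature-scaled soft-label distillation loss in PyTorch. The logits tensors are random stand-ins, not real model outputs.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    """KL divergence between temperature-softened teacher and student
    distributions, scaled by T^2 to keep gradient magnitudes stable."""
    t = temperature
    student_log_probs = F.log_softmax(student_logits / t, dim=-1)
    teacher_probs = F.softmax(teacher_logits / t, dim=-1)
    return F.kl_div(student_log_probs, teacher_probs,
                    reduction='batchmean') * (t ** 2)

# Illustrative usage with random logits standing in for model outputs.
student_logits = torch.randn(8, 21128)  # batch of 8, Chinese BERT vocab size
teacher_logits = torch.randn(8, 21128)
print(distillation_loss(student_logits, teacher_logits).item())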

This repository is developed based on: https://github.com/iflytek/MiniRBT

You may also be interested in:

  • Chinese LERT: https://github.com/ymcui/LERT
  • Chinese PERT: https://github.com/ymcui/PERT
  • Chinese MacBERT: https://github.com/ymcui/MacBERT
  • Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
  • Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
  • Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/iflytek/HFL-Anthology
