Erlangshen-Longformer-330M

Anonymous user · 2024-07-31
Source: https://modelscope.cn/models/Fengshenbang/Erlangshen-Longformer-330M
License: Apache License 2.0


简介 Brief Introduction

善于处理长文本,采用旋转位置编码的中文版3.3亿参数的Longformer-large

The Chinese Longformer-large (330M), which uses rotary position embedding (RoPE), is adept at handling long text.

模型分类 Model Taxonomy

| 需求 Demand | 任务 Task | 系列 Series | 模型 Model | 参数 Parameter | 额外 Extra |
| :--------: | :------: | :--------: | :-------: | :-----------: | :-------: |
| 通用 General | 自然语言理解 NLU | 二郎神 Erlangshen | Longformer | 330M | 中文 Chinese |

模型信息 Model Information

遵循Longformer-large的设计,我们基于chinese_roformer_L-12_H-768_A-12,在悟道语料库(180 GB版本)上进行了继续预训练。特别地,我们采用旋转位置嵌入(RoPE)来避免预训练语料库的不均匀序列长度问题。

Following the design of Longformer-large, we performed continual pre-training on the WuDao corpus (180 GB version) based on chinese_roformer_L-12_H-768_A-12. In particular, we employed rotary position embedding (RoPE) to sidestep the uneven sequence lengths in the pre-training corpus.
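The rotation RoPE applies can be illustrated with a minimal sketch in plain Python (an illustration of the general technique, not the Fengshenbang implementation): each pair of embedding dimensions is rotated by a position-dependent angle, so the dot product between two rotated vectors depends only on their relative offset, and no learned absolute-position table constrains the sequence length.

```python
import math

def rope(x, m, base=10000.0):
    """Apply rotary position embedding to vector x at position m.

    Each dimension pair (x[2i], x[2i+1]) is rotated by the angle
    m * base**(-2i/d). Position 0 is the identity rotation, and
    rotations preserve vector norms.
    """
    d = len(x)
    out = []
    for i in range(0, d, 2):
        theta = m * base ** (-i / d)
        c, s = math.cos(theta), math.sin(theta)
        out += [x[i] * c - x[i + 1] * s,
                x[i] * s + x[i + 1] * c]
    return out
```

Because only relative offsets matter, sequences of very different lengths can share the same parameters, which is the property the pre-training setup above relies on.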

使用 Usage

因为transformers库中是没有Longformer-large相关的模型结构的,所以你可以在我们的Fengshenbang-LM中找到并且运行代码。

Since the Hugging Face transformers library does not include this Longformer-large architecture, you can find the model code and run it in our Fengshenbang-LM repository:

 git clone https://github.com/IDEA-CCNL/Fengshenbang-LM.git

加载模型 Loading Models

# The Longformer classes live in the Fengshenbang-LM repo (fengshen package),
# not in Hugging Face transformers; the tokenizer is a standard BertTokenizer.
from fengshen import LongformerModel
from fengshen import LongformerConfig
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("IDEA-CCNL/Erlangshen-Longformer-330M")
config = LongformerConfig.from_pretrained("IDEA-CCNL/Erlangshen-Longformer-330M")
model = LongformerModel.from_pretrained("IDEA-CCNL/Erlangshen-Longformer-330M")
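Longformer's sparse attention distinguishes local from global tokens, so inference takes a `global_attention_mask` marking the tokens (typically `[CLS]`) that attend to the whole sequence. A minimal sketch of building the masks with plain PyTorch tensors — assuming the fengshen model follows the Hugging Face Longformer forward signature, which is an assumption; the actual model call is left commented out:

```python
import torch

# Toy token ids standing in for tokenizer output (hypothetical values).
input_ids = torch.tensor([[101, 2769, 4263, 7270, 3152, 3315, 102]])

# 1 = real token (no padding in this toy batch).
attention_mask = torch.ones_like(input_ids)

# 0 = local sliding-window attention; 1 = global attention.
global_attention_mask = torch.zeros_like(input_ids)
global_attention_mask[:, 0] = 1  # let [CLS] attend globally

# outputs = model(input_ids=input_ids,
#                 attention_mask=attention_mask,
#                 global_attention_mask=global_attention_mask)
```

For classification-style tasks, global attention on `[CLS]` alone is the common default; question-answering setups usually mark all question tokens as global instead.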

引用 Citation

如果您在您的工作中使用了我们的模型,可以引用我们的论文

If you are using the resource for your work, please cite our paper:

@article{fengshenbang,
  author    = {Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen and Ruyi Gan and Jiaxing Zhang},
  title     = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence},
  journal   = {CoRR},
  volume    = {abs/2209.02970},
  year      = {2022}
}

也可以引用我们的网站:

You can also cite our website:

@misc{Fengshenbang-LM,
  title={Fengshenbang-LM},
  author={IDEA-CCNL},
  year={2021},
  howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}},
}