Erlangshen-SimCSE-110M-Chinese

Anonymous user · July 31, 2024

Technical Information

Official site
https://github.com/IDEA-CCNL/Fengshenbang-LM
Open-source repository
https://modelscope.cn/models/Fengshenbang/Erlangshen-SimCSE-110M-Chinese
License
Apache License 2.0

Project Details

Erlangshen-SimCSE-110M-Chinese

Brief Introduction

Erlangshen-SimCSE-110M-Chinese starts from the unsupervised version of SimCSE and is further trained on the supervised SimCSE task with collected and curated Chinese NLI data. It performs well on Chinese sentence-pair tasks.
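The card does not spell out the supervised SimCSE objective, so as background: each anchor sentence is pulled toward its entailed hypothesis and pushed away from the other hypotheses in the batch, with contradiction hypotheses serving as hard negatives, under an InfoNCE loss. A minimal pure-Python sketch on toy 2-D embeddings — the function name and toy data are illustrative, not part of the released code:

```python
import math

def cosine(u, v):
    # Cosine similarity between two equal-length vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def simcse_supervised_loss(anchors, positives, negatives, temperature=0.05):
    """InfoNCE: each anchor should be closest to its own positive; other
    positives in the batch and the hard negatives are contrasted away."""
    total = 0.0
    for i, a in enumerate(anchors):
        # Logits: temperature-scaled similarity to every positive and negative.
        logits = [cosine(a, p) / temperature for p in positives]
        logits += [cosine(a, neg) / temperature for neg in negatives]
        # Softmax cross-entropy with the i-th positive as the target class.
        m = max(logits)
        denom = sum(math.exp(x - m) for x in logits)
        total += -(logits[i] - m - math.log(denom))
    return total / len(anchors)

# Toy batch: anchors aligned with their own positives, negatives pointing away.
anchors = [[1.0, 0.0], [0.0, 1.0]]
positives = [[0.9, 0.1], [0.1, 0.9]]
negatives = [[-1.0, 0.0], [0.0, -1.0]]
loss = simcse_supervised_loss(anchors, positives, negatives)
```

Lowering the temperature sharpens the softmax, which is why SimCSE-style training typically uses a small value such as 0.05.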

Model Taxonomy

| Demand | Task | Series | Model | Parameter | Extra |
| :--- | :--- | :--- | :--- | :--- | :--- |
| General | Natural Language Understanding (NLU) | Erlangshen | Bert | 110M | Chinese |

Model Information

To obtain a general-purpose sentence-embedding model, we performed contrastive learning on top of the bert-base model with a large amount of unsupervised and supervised data, and obtained a model whose [CLS] output can be used for similarity judgments without any fine-tuning. Unlike the usual approach of fine-tuning BERT on a task before running sentence-similarity, our model can extract sentence vectors directly after pre-training. Evaluation results on several tasks are as follows:

| Model | LCQMC | BQ | PAWSX | ATEC | STS-B |
| :--- | :--- | :--- | :--- | :--- | :--- |
| Bert | 62 | 38.62 | 17.38 | 28.98 | 68.27 |
| Bert-large | 63.78 | 37.51 | 18.63 | 30.24 | 68.87 |
| RoBerta | 67.3 | 39.89 | 16.79 | 30.57 | 69. |
| RoBerta-large | 67.25 | 38.39 | 19.09 | 30.85 | 69.36 |
| RoFormer | 63.58 | 39.9 | 17.52 | 29.37 | 67.32 |
| SimBERT | 73.43 | 40.98 | 15.87 | 31.24 | 72 |
| Erlangshen-SimCSE-110M-Chinese | 74.94 | 56.97 | 21.84 | 34.12 | 70.5 |

Note: our model uses the [CLS] vector directly, with no whitening; the other models use last-layer averaging plus whitening.
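To make the note above concrete, here is a hedged NumPy sketch of the two pooling strategies, run on random numbers standing in for a last layer's hidden states. The shapes and names are illustrative, not the model's API; the whitening kernel follows the usual BERT-whitening recipe (center, then rotate and rescale so the embedding set has identity covariance):

```python
import numpy as np

rng = np.random.default_rng(0)
# Fake last-layer hidden states: (batch, seq_len, hidden) = (64, 8, 4).
hidden = rng.normal(size=(64, 8, 4))

# Strategy 1 (this model): take the [CLS] position directly.
cls_vecs = hidden[:, 0, :]

# Strategy 2 (the baselines): average over tokens, then whiten.
avg_vecs = hidden.mean(axis=1)

def whiten(vecs):
    # Center, then map through U @ diag(1/sqrt(s)) so the transformed
    # vectors have (approximately) zero mean and identity covariance.
    mu = vecs.mean(axis=0)
    cov = np.cov((vecs - mu).T)
    u, s, _ = np.linalg.svd(cov)
    w = u @ np.diag(1.0 / np.sqrt(s))
    return (vecs - mu) @ w

white_vecs = whiten(avg_vecs)
```

Whitening makes cosine similarities between last-avg embeddings better behaved; the point of this model is that its [CLS] vectors can be compared without that extra step.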

Usage

Loading Models

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

model = AutoModelForMaskedLM.from_pretrained('IDEA-CCNL/Erlangshen-SimCSE-110M-Chinese')
tokenizer = AutoTokenizer.from_pretrained('IDEA-CCNL/Erlangshen-SimCSE-110M-Chinese')
```

### Usage Examples

```python
import torch
from sklearn.metrics.pairwise import cosine_similarity

texta = '今天天气真不错,我们去散步吧!'
textb = '今天天气真糟糕,还是在宅家里写bug吧!'
inputsa = tokenizer(texta, return_tensors="pt")
inputsb = tokenizer(textb, return_tensors="pt")

# torch.no_grad() skips graph construction and leaves the embeddings
# detached, so they can be passed straight to scikit-learn.
with torch.no_grad():
    outputsa = model(**inputsa, output_hidden_states=True)
    texta_embedding = outputsa.hidden_states[-1][:, 0, :].squeeze()

    outputsb = model(**inputsb, output_hidden_states=True)
    textb_embedding = outputsb.hidden_states[-1][:, 0, :].squeeze()

# If you run the model on CUDA, move the embeddings back to the CPU first,
# e.g. textb_embedding.cpu().numpy().

similarity_score = cosine_similarity(texta_embedding.reshape(1, -1),
                                     textb_embedding.reshape(1, -1))[0][0]
print(similarity_score)
```

## Citation

If you use our model in your work, please cite our [paper](https://arxiv.org/abs/2209.02970):

```text
@article{fengshenbang,
  author  = {Junjie Wang and Yuxiang Zhang and Lin Zhang and Ping Yang and Xinyu Gao and Ziwei Wu and Xiaoqun Dong and Junqing He and Jianheng Zhuo and Qi Yang and Yongfeng Huang and Xiayu Li and Yanghan Wu and Junyu Lu and Xinyu Zhu and Weifeng Chen and Ting Han and Kunhao Pan and Rui Wang and Hao Wang and Xiaojun Wu and Zhongshen Zeng and Chongpei Chen and Ruyi Gan and Jiaxing Zhang},
  title   = {Fengshenbang 1.0: Being the Foundation of Chinese Cognitive Intelligence},
  journal = {CoRR},
  volume  = {abs/2209.02970},
  year    = {2022}
}
```

You can also cite our [website](https://github.com/IDEA-CCNL/Fengshenbang-LM/):

```text
@misc{Fengshenbang-LM,
  title={Fengshenbang-LM},
  author={IDEA-CCNL},
  year={2021},
  howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}},
}
```
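The usage example scores one sentence pair at a time; for many sentences it is common to stack the embeddings and score every pair at once. A dependency-free sketch of that pairwise scoring, with toy lists standing in for the model's [CLS] vectors (all names here are illustrative):

```python
import math

def normalize(v):
    # Scale a vector to unit length.
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def similarity_matrix(embeddings):
    # Cosine similarity of every pair; on unit vectors it is a dot product.
    unit = [normalize(v) for v in embeddings]
    return [[sum(a * b for a, b in zip(u, w)) for w in unit] for u in unit]

# Toy [CLS] vectors for three sentences; the first two are similar.
emb = [[0.2, 0.9, 0.1], [0.1, 0.8, 0.3], [0.9, -0.1, 0.2]]
sims = similarity_matrix(emb)
```

In practice you would build `emb` by stacking `hidden_states[-1][:, 0, :]` rows from batched model calls; the scoring step is the same.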

