chinese-mixtral-instruct-lora

Posted: July 31, 2024
Source: https://modelscope.cn/models/ChineseAlpacaGroup/chinese-mixtral-instruct-lora
License: apache-2.0

Details

Chinese-Mixtral-Instruct-LoRA

Chinese Mixtral GitHub repository: https://github.com/ymcui/Chinese-Mixtral

This repository contains Chinese-Mixtral-Instruct-LoRA, which was further tuned on instruction data on top of Chinese-Mixtral, where Chinese-Mixtral itself is built on top of Mixtral-8x7B-v0.1.

Note: You must combine this LoRA with the original Mixtral-8x7B-v0.1 to obtain the full model weights.
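The merge step above can be sketched with the Hugging Face `transformers` and `peft` libraries. This is a minimal sketch, not the authors' official script; the LoRA repository ID used below is an assumption (the card only gives the ModelScope ID), so substitute the actual local path or hub ID of the adapter you downloaded.

```python
# Sketch: fold Chinese-Mixtral-Instruct-LoRA into the base Mixtral-8x7B-v0.1
# to obtain full weights, using transformers + peft.

BASE_MODEL = "mistralai/Mixtral-8x7B-v0.1"   # base model named in this card
LORA_MODEL = "path/to/chinese-mixtral-instruct-lora"  # assumed: your local adapter path


def merge_lora(base_id: str, lora_id: str, out_dir: str) -> None:
    """Load the base model, apply the LoRA adapter, merge, and save full weights."""
    # Heavy dependencies are imported here so the sketch can be read without them.
    import torch
    from peft import PeftModel
    from transformers import AutoModelForCausalLM, AutoTokenizer

    base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
    model = PeftModel.from_pretrained(base, lora_id)
    merged = model.merge_and_unload()  # folds the LoRA deltas into the base weights
    merged.save_pretrained(out_dir)
    AutoTokenizer.from_pretrained(lora_id).save_pretrained(out_dir)
```

Usage: `merge_lora(BASE_MODEL, LORA_MODEL, "chinese-mixtral-instruct-merged")`. Expect this to need substantial RAM/VRAM, since Mixtral-8x7B is loaded in full before merging.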

Others

  • For full model, please see: https://huggingface.co/hfl/chinese-mixtral-instruct

  • For GGUF model (llama.cpp compatible), please see: https://huggingface.co/hfl/chinese-mixtral-instruct-gguf

  • If you have questions/issues regarding this model, please submit an issue through https://github.com/ymcui/Chinese-Mixtral/.

Citation

Please consider citing our paper if you use the resources from this repository. Paper link: https://arxiv.org/abs/2403.01851

@article{chinese-mixtral,
      title={Rethinking LLM Language Adaptation: A Case Study on Chinese Mixtral}, 
      author={Cui, Yiming and Yao, Xin},
      journal={arXiv preprint arXiv:2403.01851},
      url={https://arxiv.org/abs/2403.01851},
      year={2024}
}