MixTAO 7Bx2-MoE-v8.1 GGUF

Posted by an anonymous user on 2024-07-31
Category: AI / PyTorch
Open-source URL: https://modelscope.cn/models/zhengr/MixTAO-7Bx2-MoE-v8.1-GGUF
License: Apache License 2.0

Details

We are a small but sophisticated creative team that upholds high scientific standards. We are committed to developing efficient, practical, and powerful AI models with an emphasis on deep research and fast business thinking. Our goal is to provide powerful generative modeling solutions for software builders and enterprise users to advance AI technology.

We work quickly and efficiently as a team, with a strong sense of personal responsibility and teamwork. We attach great importance to internal transparency and believe it is the key to efficiency. We are rigorous and pragmatic and believe that this is the way to build the best technology.

We are creative and understand that shipping the best AI models is a challenge of finding unique efficiency accelerators. We are application-driven and hope that our optimized, industry-leading pre-trained models genuinely help in real-world applications. Our team has diverse talent from many fields with extensive AI expertise.

For model files and weights, see the "Model Files" page.
The contributor of this model has not provided a more detailed model description, but you can download the model with the git clone command below or via the ModelScope SDK.
Clone with HTTP
git clone https://www.modelscope.cn/zhengr/MixTAO-7Bx2-MoE-v8.1-GGUF.git
If you are a contributor to this model, we invite you to complete the model card promptly according to the model contribution documentation.
