Model Card for CodeFuse-13B

CodeFuse-13B is a 13 billion parameter code generation model trained on the GPT-NeoX framework, capable of handling code sequences of up to 4096 characters. The model was pretrained on a dataset of 1000B tokens of code, Chinese, and English data, covering over 40 programming languages. To further improve the effectiveness and quality of the generated code, it was fine-tuned on the CodeFuse-Evol-instruction-66k dataset, enabling it to produce more accurate, efficient, and compliant code. It achieves a Pass@1 of 37.1% on the HumanEval evaluation set (BeamSearch decoding, BeamSize=3).

If you wish to fine-tune the model yourself, you can visit ✨MFTCoder✨
If you wish to deploy the model yourself, you can visit ✨FasterTransformer4CodeFuse✨
If you wish to see a demo of the model, you can visit ✨CodeFuse Demo✨

Note: model files may be corrupted during transfer. Please verify the MD5 values before use.
Model Description
Code Community
Requirements
Quickstart
import torch
from modelscope import AutoModelForCausalLM, AutoTokenizer, snapshot_download
model_dir = snapshot_download('codefuse-ai/CodeFuse-13B', revision='v1.0.0')
tokenizer = AutoTokenizer.from_pretrained(model_dir)
model = AutoModelForCausalLM.from_pretrained(model_dir, device_map="auto", torch_dtype=torch.float16).eval()
input_ids = tokenizer.encode("# language: Python\ndef quick_sort(array):\n", return_tensors="pt").to("cuda")
output_ids = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(output_ids[0]))
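The reported HumanEval score uses beam-search decoding with BeamSize=3 rather than the default greedy decoding shown above. A minimal sketch of the corresponding generation settings, assuming the `model`, `tokenizer`, and `input_ids` from the Quickstart; `num_beams`, `do_sample`, and `max_new_tokens` are standard Hugging Face `generate()` arguments, and the exact evaluation harness is not shown here.

```python
# Hedged sketch: decoding settings matching the reported HumanEval
# setup (BeamSearch, BeamSize=3). These are standard `generate()`
# keyword arguments for transformers-style causal LMs.
beam_kwargs = dict(
    num_beams=3,        # BeamSize=3, as reported for the Pass@1 score
    do_sample=False,    # deterministic beam search, no sampling
    max_new_tokens=200,
)

# With `model`, `tokenizer`, and `input_ids` from the Quickstart above:
# output_ids = model.generate(input_ids, **beam_kwargs)
# print(tokenizer.decode(output_ids[0]))
```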
MD5
Model File                          MD5 Value
pytorch_model-00001-of-00006.bin    b79e4ccc93c40fa6113aaf6a434473d5
pytorch_model-00002-of-00006.bin    5a82f19e3f62c693e41fe627084c722b
pytorch_model-00003-of-00006.bin    d4b53c391a353d0fc0a1be1c913d5f04
pytorch_model-00004-of-00006.bin    f9e3dcdea13ff02f4e3aad4f9db7a33f
pytorch_model-00005-of-00006.bin    698a8f2f05723a572193733bce12eb93
pytorch_model-00006-of-00006.bin    312439d0b810f1bb81034fe094ff84c7