Model Introduction

We conducted tests on WisdoMentor using authoritative datasets in various domains, including General and Mathematics. In the general domain, we evaluated WisdoMentor on three authoritative datasets: C-Eval, MMLU, and CMMLU. These datasets cover comprehensive evaluations of Chinese and English base models, as well as comprehension and reasoning abilities in Chinese contexts. Below, we demonstrate inference using FastChat, Transformers, ModelScope, and a Web demo.

The dialogue model adopts the ChatML format to support general dialogue and agent applications.

To ensure better usability, please install the dependencies as instructed below before performing inference with Transformers or ModelScope. When loading the WisdoMentor-8b model from ModelScope or Huggingface, modify the code below to suit your local computational resources; you can replace the model name with a different size of WisdoMentor.

We hereby declare that our development team has not developed any applications based on the WisdoMentor model, whether on iOS, Android, the web, or any other platform. We strongly urge all users not to use the WisdoMentor model for any activities that may jeopardize national or social security or violate the law. Furthermore, we request that users not use the WisdoMentor model for internet services without proper security review and filing. We hope that all users will abide by this principle, ensuring that technological advancement takes place in a regulated and lawful environment.
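The ChatML format mentioned above delimits each conversation turn with special tokens. As a minimal sketch of the convention (the exact special tokens used by WisdoMentor are an assumption based on the standard ChatML layout; the function name is illustrative):

```python
def build_chatml_prompt(messages):
    """Render a list of {"role", "content"} dicts as a ChatML string.

    Each turn is wrapped in <|im_start|> / <|im_end|> markers, and the
    prompt ends with an opened assistant turn for the model to complete.
    """
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Introduce the difference between BERT and GPT."},
])
print(prompt)
```

In practice the tokenizer shipped with the model applies this template for you; the sketch only shows the shape of the prompt the model sees.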
Performance on Benchmarks

General Domain
Performance of WisdoMentor-8B

| C-Eval (5-shot) | MMLU (5-shot) | CMMLU (5-shot) |
|-----------------|---------------|----------------|
| 68.40 | 83.93 | 70.33 |
| 51.10 | 68.54 | 54.06 |
| 27.10 | 35.10 | 26.75 |
| 28.90 | 45.73 | 31.38 |
| 27.15 | 27.93 | 26.00 |
| 24.23 | 26.03 | 25.66 |
| 50.20 | 45.90 | 49.00 |
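All three benchmarks above are reported in the 5-shot setting, meaning each test question is preceded by five solved examples in the prompt. A sketch of assembling such a few-shot multiple-choice prompt in the C-Eval/MMLU style (the exact template is an assumption; each evaluation harness has its own formatting):

```python
def build_few_shot_prompt(examples, question, choices):
    """Concatenate solved examples before the target question.

    Each block is: question, lettered options, "Answer: <letter>";
    the final block leaves "Answer:" open for the model to fill in.
    """
    def render(q, opts, answer=None):
        lines = [q]
        lines += [f"{letter}. {text}" for letter, text in zip("ABCD", opts)]
        lines.append(f"Answer: {answer}" if answer else "Answer:")
        return "\n".join(lines)

    blocks = [render(ex["question"], ex["choices"], ex["answer"]) for ex in examples]
    blocks.append(render(question, choices))
    return "\n\n".join(blocks)

shots = [{"question": "1 + 1 = ?", "choices": ["1", "2", "3", "4"], "answer": "B"}] * 5
prompt = build_few_shot_prompt(shots, "2 + 2 = ?", ["2", "3", "4", "5"])
print(prompt)
```

The model's score is then computed from which option letter it emits after the final open "Answer:".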
Math Ability

Code Ability

Inference and Deployment

Install Dependencies
git clone https://www.modelscope.cn/lijh1118/WisdoMentor-8b
conda create --name WisdoMentor python=3.11.8
conda activate WisdoMentor
pip install -r requirements.txt
Deploying Inference with FastChat
git clone https://www.modelscope.cn/lijh1118/WisdoMentor-8b path_to_local_WisdoMentor-8b
cd path_to_local_WisdoMentor-8b
python -m fastchat.serve.cli --model-path path_to_local_WisdoMentor-8b
Q: Introduce the differences between BERT and GPT.
A: BERT (Bidirectional Encoder Representations from Transformers) and GPT (Generative Pre-trained Transformer) are both pre-trained natural language processing models, but they differ in their pre-training tasks and application scenarios. BERT is pre-trained with a bidirectional encoder architecture, which lets it capture context on both sides of a token and gives it strong language understanding and semantic representation abilities. Its pre-training involves masking parts of a sentence and predicting the relationship between two sentence segments. BERT performs well on NLP tasks such as text classification, sentiment analysis, named entity recognition, text matching, question answering, and summarization. GPT, unlike BERT, uses a unidirectional architecture; its pre-training task is to predict the next token given the preceding text. GPT also performs well on NLP tasks such as text generation, dialogue systems, machine translation, and question answering. Although BERT and GPT differ in their pre-training tasks, both are pre-trained NLP models that can be fine-tuned for specific tasks to achieve better performance; the choice between them depends on the requirements and goals of the task at hand.
Deploying Inference with ModelScope
import torch
from modelscope import snapshot_download, AutoTokenizer, AutoModelForCausalLM

model_dir = snapshot_download('lijh1118/WisdoMentor-8b')
tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_dir, device_map="auto", trust_remote_code=True, torch_dtype=torch.float16)
model = model.eval()
response, history = model.chat(tokenizer, "请介绍下Bert和GPT的区别", history=[])
print(response)
response, history = model.chat(tokenizer, "请介绍下Self-Attention机制", history=history)
print(response)
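In the snippet above, `model.chat` returns the updated `history`, which is passed back in so that the follow-up question keeps its context. The threading pattern can be sketched without the model itself (the `fake_chat` stand-in and the tuple-based history layout are assumptions for illustration; the real structure is defined by the model's remote code):

```python
def fake_chat(tokenizer, query, history):
    """Stand-in for model.chat: returns a reply and the extended history."""
    response = f"echo: {query}"
    return response, history + [(query, response)]

history = []
for question in ["请介绍下Bert和GPT的区别", "请介绍下Self-Attention机制"]:
    response, history = fake_chat(None, question, history)

print(len(history))  # two (query, response) turns recorded
```

Dropping the returned `history` (or always passing `history=[]`) makes each question stateless, which is sometimes what you want for batch evaluation.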
Deploying Inference with Huggingface
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("lijh1118/WisdoMentor-8b", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("lijh1118/WisdoMentor-8b", device_map="auto", trust_remote_code=True, torch_dtype=torch.float16)
# 8-bit / 4-bit Quantization
# pip install -U bitsandbytes
# 8-bit: model = AutoModelForCausalLM.from_pretrained("lijh1118/WisdoMentor-8b", device_map="auto", trust_remote_code=True, load_in_8bit=True)
# 4-bit: model = AutoModelForCausalLM.from_pretrained("lijh1118/WisdoMentor-8b", device_map="auto", trust_remote_code=True, load_in_4bit=True)
model = model.eval()
response, history = model.chat(tokenizer, "请介绍下Bert和GPT的区别", history=[])
print(response)
response, history = model.chat(tokenizer, "请介绍下Self-Attention机制", history=history)
print(response)
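The commented 8-bit and 4-bit options above trade some model quality for memory. A rough back-of-envelope for the weight-only footprint of an 8B-parameter model (activations and KV cache come on top, and the round 8e9 parameter count is an assumption):

```python
def weight_gib(n_params, bits_per_param):
    """Approximate weight-only memory in GiB at a given precision."""
    return n_params * bits_per_param / 8 / 1024**3

for bits in (16, 8, 4):
    print(f"{bits:>2}-bit: {weight_gib(8e9, bits):5.1f} GiB")
```

At float16 this comes to roughly 15 GiB of weights, which is why single-GPU setups with 16 GB or less of VRAM usually need the 8-bit or 4-bit path.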
Declaration and Agreement

Declaration