[Dataset] • [Github Repo] • [Project Page] • [Paper]
AgentLM-70B

The models follow the conversation format of Llama-2-chat, with the system prompt fixed as

You are a helpful, respectful and honest assistant.

7B, 13B, and 70B models are available on the ModelScope model hub; see the Models table below.
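As a small illustrative sketch (the build_prompt helper is not part of the released code), a single-turn prompt in this template can be assembled as follows; the layout mirrors the usage example in the next section.

SYSTEM_PROMPT = "You are a helpful, respectful and honest assistant."

def build_prompt(user_message, system_prompt=SYSTEM_PROMPT):
    # Wrap a single user turn in the Llama-2-chat template with the fixed system prompt.
    return (
        "<s>[INST] <<SYS>>\n"
        f"{system_prompt}\n"
        "<</SYS>>\n"
        f"{user_message} [/INST]"
    )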
How to use in ModelScope
import torch
from modelscope import Model, AutoTokenizer

# Load the 70B checkpoint in fp16 and spread it across available GPUs.
model = Model.from_pretrained("ZhipuAI/agentlm-70b", revision='master', device_map='auto', torch_dtype=torch.float16)
tokenizer = AutoTokenizer.from_pretrained("ZhipuAI/agentlm-70b", revision='master')

# Single-turn prompt in the Llama-2-chat format with the fixed system prompt.
prompt = """<s>[INST] <<SYS>>
You are a helpful, respectful and honest assistant.
<</SYS>>
There's a llama in my garden! What should I do? [/INST]"""

inputs = tokenizer(prompt, return_tensors="pt")
# Generate
generate_ids = model.generate(inputs.input_ids.to(model.device), max_new_tokens=512)
print(tokenizer.batch_decode(generate_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0])
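Multi-turn use follows the standard Llama-2-chat convention: the previous assistant reply is appended after its [/INST] tag and closed with </s>, and the next user turn opens a new <s>[INST] ... [/INST] block. A minimal sketch, assuming first_reply holds the assistant text produced by the call above and using an illustrative follow-up question:

# Continue the conversation; `first_reply` is the assistant's answer from the previous turn.
followup_prompt = prompt + f" {first_reply} </s><s>[INST] Is it safe to feed the llama? [/INST]"
inputs = tokenizer(followup_prompt, return_tensors="pt")
generate_ids = model.generate(inputs.input_ids.to(model.device), max_new_tokens=512)
print(tokenizer.batch_decode(generate_ids, skip_special_tokens=True, clean_up_tokenization_spaces=False)[0])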
Models

Model | ModelScope Repo
AgentLM-7B | ModelScope Repo
AgentLM-13B | ModelScope Repo
AgentLM-70B | ModelScope Repo
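To try a smaller variant, the same loading code can be pointed at the corresponding repository; the repo ids below are an assumption that the 7B and 13B models follow the same naming pattern as "ZhipuAI/agentlm-70b" (check the ModelScope pages listed above for the exact ids).

# Assumed repo id for the 7B variant, mirroring the 70B naming scheme.
model_7b = Model.from_pretrained("ZhipuAI/agentlm-7b", revision='master', device_map='auto', torch_dtype=torch.float16)
tokenizer_7b = AutoTokenizer.from_pretrained("ZhipuAI/agentlm-7b", revision='master')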
Citation

If you find our work useful, please consider citing AgentTuning:
@misc{zeng2023agenttuning,
      title={AgentTuning: Enabling Generalized Agent Abilities for LLMs},
      author={Aohan Zeng and Mingdao Liu and Rui Lu and Bowen Wang and Xiao Liu and Yuxiao Dong and Jie Tang},
      year={2023},
      eprint={2310.12823},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}