Phi-Hermes-1.3B

Technical Information

Open-source URL
https://modelscope.cn/models/keepitsimple/Phi-Hermes-1.3B
License
other

Project Details

Phi-1.5 fine-tuned with the Hermes Dataset

Model Details

Model Sources

This model was trained on the OpenHermes Dataset, made by me, which consists of over 240,000 mostly GPT-4-generated synthetic datapoints.
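
For reference, here is a minimal sketch for peeking at the training data with the Hugging Face datasets library; the dataset ID "teknium/openhermes" is an assumption, so substitute whatever identifier the dataset is actually published under.

from datasets import load_dataset

# Assumed dataset ID -- adjust if the dataset is hosted under a different name
openhermes = load_dataset("teknium/openhermes", split="train")
print(len(openhermes))   # roughly 240k synthetic datapoints, per the description above
print(openhermes[0])     # inspect the fields of a single record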


Uses

Let me know!

How to Get Started with the Model

Phi does not support device_map "auto", and does not seem to want to run inference in fp16, so use bf16.

Here is working code for inference, though it can be improved:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the model and tokenizer in bf16 (Phi does not inference reliably in fp16) and put the model on the GPU
model = AutoModelForCausalLM.from_pretrained("teknium/Puffin-Phi-v2", trust_remote_code=True, torch_dtype=torch.bfloat16).to("cuda")
tokenizer = AutoTokenizer.from_pretrained("teknium/Puffin-Phi-v2", trust_remote_code=True, torch_dtype=torch.bfloat16)
# Build the Alpaca-style prompt and move the inputs to the same device as the model
inputs = tokenizer("### Instruction:\nWrite a negative review for the website, Twitter.\n### Response:\n", return_tensors="pt", return_attention_mask=False).to("cuda")
outputs = model.generate(**inputs, max_length=128, do_sample=True, temperature=0.2, top_p=0.9, use_cache=True, repetition_penalty=1.2, eos_token_id=tokenizer.eos_token_id)
text = tokenizer.batch_decode(outputs)[0]
print(text)
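
One way the snippet above could be improved (my own sketch, not part of the original card; it reuses the model and tokenizer loaded above): pass the attention mask through, cap generation with max_new_tokens, and decode only the newly generated tokens so the prompt is not echoed back.

prompt = "### Instruction:\nWrite a negative review for the website, Twitter.\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda")   # attention mask included this time
outputs = model.generate(
    **inputs,
    max_new_tokens=128,        # cap newly generated tokens instead of total length
    do_sample=True,
    temperature=0.2,
    top_p=0.9,
    repetition_penalty=1.2,
    eos_token_id=tokenizer.eos_token_id,
)
# Decode only the tokens that come after the prompt
response = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(response)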

The prompt format is Alpaca; the model is prompted like so:

### Instruction:
<prompt>
### Response:
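
For illustration, a tiny helper (my own sketch, not part of the original card) that wraps an instruction in this Alpaca-style template before handing it to the tokenizer:

def build_prompt(instruction: str) -> str:
    # Wrap a raw instruction in the Alpaca-style template shown above
    return f"### Instruction:\n{instruction}\n### Response:\n"

prompt = build_prompt("Write a haiku about small language models.")
# `prompt` can then be tokenized exactly as in the inference snippet above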

Training Details

Training Procedure

Trained with Axolotl. View the wandb runs for all my puffin runs (this is puffin-phi-4 on wandb): https://wandb.ai/teknium1/hermes-phi/runs/hermes-phi-1

Evaluation

