Universal AnglE Embedding

Our universal English sentence embedding, WhereIsAI/UAE-Large-V1.

Usage

python -m pip install -U angle-emb

1) Non-Retrieval Tasks
from angle_emb import AnglE

# Load the model; UAE-Large-V1 uses CLS pooling
angle = AnglE.from_pretrained('WhereIsAI/UAE-Large-V1', pooling_strategy='cls').cuda()

# Encode a single sentence
vec = angle.encode('hello world', to_numpy=True)
print(vec)

# Encode a batch of sentences
vecs = angle.encode(['hello world1', 'hello world2'], to_numpy=True)
print(vecs)
2) Retrieval Tasks

For retrieval purposes, please use the prompt Prompts.C:
from angle_emb import AnglE, Prompts

angle = AnglE.from_pretrained('WhereIsAI/UAE-Large-V1', pooling_strategy='cls').cuda()
# Prompts.C wraps each input for retrieval tasks
angle.set_prompt(prompt=Prompts.C)

# With a prompt set, inputs are passed as dicts with a 'text' key
vec = angle.encode({'text': 'hello world'}, to_numpy=True)
print(vec)

vecs = angle.encode([{'text': 'hello world1'}, {'text': 'hello world2'}], to_numpy=True)
print(vecs)
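Once you have embeddings, retrieval typically means ranking documents by cosine similarity between the query vector and each document vector. A minimal sketch with NumPy (the small vectors below are stand-ins for the arrays returned by angle.encode, and cosine_rank is a hypothetical helper, not part of angle_emb):

```python
import numpy as np

def cosine_rank(query_vec, doc_vecs):
    """Rank documents by cosine similarity to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q
    # Indices of documents, most similar first
    return np.argsort(-scores), scores

# Stand-in vectors; in practice these come from angle.encode(...)
query = np.array([1.0, 0.0, 1.0])
docs = np.array([[1.0, 0.1, 0.9],   # close to the query direction
                 [0.0, 1.0, 0.0]])  # orthogonal to the query
order, scores = cosine_rank(query, docs)
print(order)  # most similar document first
```

Normalizing both sides first keeps the dot product equal to cosine similarity regardless of vector magnitude.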
Citation

If you use our pre-trained models, welcome to support us by citing our work:
@article{li2023angle,
  title={AnglE-optimized Text Embeddings},
  author={Li, Xianming and Li, Jing},
  journal={arXiv preprint arXiv:2309.12871},
  year={2023}
}