Phi-1.5 fine-tuned with the Hermes Dataset

This model was trained on the OpenHermes Dataset, made by me, which consists of over 240,000 mostly GPT-4 generated synthetic data points. Let me know what you think!

Phi does not support device_map="auto", and does not seem to want to inference in fp16, so use bf16. Working inference code is included below, though it can be improved. The prompt format is Alpaca; an example of the format follows the code.

Trained with Axolotl. View the wandb runs for all my Puffin runs (this is puffin-phi-4 on wandb):
https://wandb.ai/teknium1/hermes-phi/runs/hermes-phi-1
Model Details
Model Sources
Uses
How to Get Started with the Model
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Phi requires trust_remote_code; load in bf16 (fp16 inference is unreliable for this model)
model = AutoModelForCausalLM.from_pretrained("teknium/Puffin-Phi-v2", trust_remote_code=True, torch_dtype=torch.bfloat16).to("cuda")
tokenizer = AutoTokenizer.from_pretrained("teknium/Puffin-Phi-v2", trust_remote_code=True)

# Build an Alpaca-style prompt and move the input ids to the same device as the model
inputs = tokenizer("### Instruction:\nWrite a negative review for the website, Twitter.\n### Response:\n", return_tensors="pt", return_attention_mask=False).to("cuda")
outputs = model.generate(**inputs, max_length=128, do_sample=True, temperature=0.2, top_p=0.9, use_cache=True, repetition_penalty=1.2, eos_token_id=tokenizer.eos_token_id)
text = tokenizer.batch_decode(outputs)[0]
print(text)
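The decoded text above includes the prompt itself, since causal LMs echo their input. A minimal post-processing sketch (not part of the original card; the helper name is my own) that keeps only the model's answer, assuming the Alpaca-style "### Response:" marker used in the code:

```python
def extract_response(generated: str) -> str:
    # Keep everything after the first "### Response:" marker, trimmed.
    return generated.split("### Response:", 1)[-1].strip()

# Example on a toy generated string:
sample = "### Instruction:\nSay hi.\n### Response:\nHello there."
print(extract_response(sample))
```

You may also want to truncate at the EOS token string if the model emits it in the decoded text.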
### Instruction:
<prompt>
### Response:
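The template above can be filled programmatically. A small sketch (the helper name is my own, not from the card) that builds the Alpaca-style prompt with the newlines the inference code expects:

```python
def build_prompt(instruction: str) -> str:
    # Alpaca-style format: instruction block, then an open Response block
    # for the model to complete.
    return f"### Instruction:\n{instruction}\n### Response:\n"

prompt = build_prompt("Write a haiku about autumn.")
print(prompt)
```

The resulting string is what you pass to the tokenizer in the inference example.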
Training Details
Training Procedure
Evaluation