**OceanGPT(沧渊): A Large Language Model for Ocean Science Tasks**

OceanGPT-2B-v0.1 is based on MiniCPM-2B and has been trained on a bilingual ocean-domain dataset covering both Chinese and English. OceanGPT(沧渊) is trained on top of open-source large language models, including Qwen, MiniCPM, and LLaMA. Thanks for their great contributions! Please cite the paper below if you use OceanGPT in your work. Note the following limitations: the model may hallucinate; we did not optimize for identity, so it may generate identity information similar to that of the Qwen/MiniCPM/LLaMA/GPT series models; and its output is influenced by prompt tokens, which may lead to inconsistent results across multiple attempts.
⏩Quickstart
Download the model
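A minimal sketch of fetching the weights with the `huggingface-cli` tool from `huggingface_hub`; the repo id `zjunlp/OceanGPT-2b-v0.1` is an assumption, so verify the exact name on the official model page:

```shell
# Install the Hugging Face Hub client, then download the weights.
# NOTE: the repo id below is an assumption; check the model page for the exact id.
pip install -U "huggingface_hub[cli]"
huggingface-cli download zjunlp/OceanGPT-2b-v0.1 --local-dir ./OceanGPT-2b-v0.1
```

The resulting local directory can then be used as the model path in the inference snippet.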
Inference
```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

device = "cuda"  # the device to load the model onto
path = 'YOUR-MODEL-PATH'

model = AutoModelForCausalLM.from_pretrained(
    path,
    torch_dtype=torch.bfloat16,
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(path)

prompt = "Which is the largest ocean in the world?"
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": prompt}
]
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(device)

generated_ids = model.generate(
    model_inputs.input_ids,
    max_new_tokens=512
)
# Strip the prompt tokens so only the newly generated tokens are decoded.
generated_ids = [
    output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]
response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
```
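The list comprehension in the snippet strips the prompt tokens from each sequence returned by `model.generate`, which echoes the prompt followed by the newly generated tokens. A standalone illustration of that slicing with dummy token ids (the id values themselves are arbitrary):

```python
# Dummy ids standing in for model_inputs.input_ids and the output of model.generate.
input_ids = [[101, 7592, 102]]                       # prompt: 3 tokens
generated_ids = [[101, 7592, 102, 2023, 2003, 102]]  # prompt + 3 new tokens

# Same slicing as above: drop the first len(prompt) tokens from each sequence.
new_token_ids = [out[len(inp):] for inp, out in zip(input_ids, generated_ids)]
print(new_token_ids)  # → [[2023, 2003, 102]]
```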
Models

| Model Name | HuggingFace | WiseModel | ModelScope |
| --- | --- | --- | --- |
| OceanGPT-14B-v0.1 (based on Qwen) | 14B | 14B | 14B |
| OceanGPT-7B-v0.2 (based on Qwen) | 7B | 7B | 7B |
| OceanGPT-2B-v0.1 (based on MiniCPM) | 2B | 2B | 2B |
| OceanGPT-V | To be released | To be released | To be released |
Acknowledgement
Limitations

Citation
```bibtex
@article{bi2023oceangpt,
  title={OceanGPT: A Large Language Model for Ocean Science Tasks},
  author={Bi, Zhen and Zhang, Ningyu and Xue, Yida and Ou, Yixin and Ji, Daxiong and Zheng, Guozhou and Chen, Huajun},
  journal={arXiv preprint arXiv:2310.02031},
  year={2023}
}
```