neo_scalinglaw_980M


Technical Information

Official website
https://m-a-p.ai/
Open-source repository
https://modelscope.cn/models/m-a-p/neo_scalinglaw_980M
License
apache-2.0

Project Details

NEO

🤗 Neo-Models | 🤗 Neo-Datasets | Github

Neo is a completely open source large language model, including code, all model weights, datasets used for training, and training details.

Model

Model | Description | Download
neo_7b | This repository contains the base model of neo_7b | 🤗 Hugging Face
neo_7b_intermediate | This repo contains normal pre-training intermediate ckpts. A total of 3.7T tokens were learned at this phase. | 🤗 Hugging Face
neo_7b_decay | This repo contains intermediate ckpts during the decay phase. A total of 720B tokens were learned at this phase. | 🤗 Hugging Face
neo_scalinglaw_980M | This repo contains ckpts related to scaling-law experiments | 🤗 Hugging Face
neo_scalinglaw_460M | This repo contains ckpts related to scaling-law experiments | 🤗 Hugging Face
neo_scalinglaw_250M | This repo contains ckpts related to scaling-law experiments | 🤗 Hugging Face
neo_2b_general | This repo contains ckpts of a 2B model trained using common domain knowledge | 🤗 Hugging Face

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

model_path = '<your-hf-model-path-with-tokenizer>'

# use_fast=False and trust_remote_code=True let any custom tokenizer code
# shipped with the checkpoint load
tokenizer = AutoTokenizer.from_pretrained(model_path, use_fast=False, trust_remote_code=True)

model = AutoModelForCausalLM.from_pretrained(
    model_path,
    device_map="auto",   # spread weights across available devices
    torch_dtype='auto'   # use the dtype stored in the checkpoint
).eval()

input_text = "A long, long time ago,"

# Tokenize the prompt and move it to the model's device
input_ids = tokenizer(input_text, return_tensors='pt').to(model.device)
output_ids = model.generate(**input_ids, max_new_tokens=20)
response = tokenizer.decode(output_ids[0], skip_special_tokens=True)

print(response)
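
Since these are base (pre-trained, non-chat) models, greedy decoding on a short prompt can produce repetitive text, and sampling usually reads better. The following reuses the model, tokenizer, and input_ids from the snippet above and only swaps in standard sampling arguments of generate; the parameter values are illustrative, not recommendations from the authors:

output_ids = model.generate(
    **input_ids,
    max_new_tokens=50,
    do_sample=True,    # sample instead of greedy decoding
    temperature=0.8,   # soften the next-token distribution
    top_p=0.95,        # nucleus sampling cutoff
)
response = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(response)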

