Snowflake Arctic (Model Details): Arctic is a dense-MoE Hybrid transformer architecture pre-trained from scratch by the Snowflake AI Research Team. (PyTorch)
OpenELM: An Efficient Language Model Family with Open Training and Inference Framework. Sachin Mehta, Mohammad Hossein Sekhavat, Qingqing Cao, Maxwell Horton, Yanzi Jin, Chenfan Sun, et al. (PyTorch; the catalog lists several OpenELM variants under this same description.)
Several entries carry only the default ModelScope notice: the contributors of these models have not provided a more detailed introduction. Model files and weights can be found on each entry's "Model Files" page, and a model can be downloaded either with the git clone command shown there or via the ModelScope SDK.
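As a minimal sketch of the SDK download route mentioned above, assuming the modelscope Python package is installed; the model id below is a placeholder rather than one of the entries in this listing.

# Download a model's files and weights with the ModelScope SDK (pip install modelscope).
from modelscope import snapshot_download

# Placeholder id; substitute the "namespace/model-name" shown on the model page.
local_dir = snapshot_download("your-namespace/your-model")
print(local_dir)  # local directory containing the downloaded files and weights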
DPO data (only the Chinese portion is used for training; long-conversation preferences). DPO-EN-ZH-20k contains a large number of preference-aligned question-answer pairs, which helps further improve the chat model's dialogue quality, making its responses more detailed and better aligned with human preferences.
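A hedged sketch of direct preference optimization on such preference pairs, assuming a recent version of trl (where DPOTrainer accepts processing_class; older releases use tokenizer=) and a dataset already formatted with prompt/chosen/rejected columns. The model and dataset paths are placeholders, not taken from this listing.

from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

model_id = "path/to/chat-model"      # placeholder: the chat model being aligned
data_path = "path/to/dpo-en-zh-20k"  # placeholder: local copy of the preference data

model = AutoModelForCausalLM.from_pretrained(model_id)
tokenizer = AutoTokenizer.from_pretrained(model_id)
train_dataset = load_dataset(data_path, split="train")  # expects prompt/chosen/rejected

args = DPOConfig(output_dir="dpo-zh-output", beta=0.1, per_device_train_batch_size=1)
trainer = DPOTrainer(
    model=model,                 # a frozen reference model is created automatically
    args=args,
    train_dataset=train_dataset,
    processing_class=tokenizer,
)
trainer.train()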
Description: The Expert_Baike_Word2vec_11B_zh model was trained on 9 domain-specific Chinese corpora.
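A minimal sketch of loading such pre-trained word vectors with gensim, assuming the weights are exported in the standard word2vec binary format; the file name and query word are placeholders, not taken from the model card.

from gensim.models import KeyedVectors

# Placeholder path; point this at the vector file shipped with the model.
vectors = KeyedVectors.load_word2vec_format("expert_baike_word2vec_zh.bin", binary=True)

# Nearest neighbours of a Chinese query word in the embedding space.
print(vectors.most_similar("百科", topn=5))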
Chinese-Alpaca-2-7B-RLHF: this repository contains Chinese-Alpaca-2-7B-RLHF, which is tuned on Chinese-Alpaca-2-7B with RLHF.
Chinese-Alpaca-2-1.3B-RLHF: this repository contains Chinese-Alpaca-2-1.3B-RLHF, which is tuned on Chinese-Alpaca-2-1.3B with RLHF.
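A hedged sketch of running one of these RLHF chat checkpoints with transformers; the hub id is an assumption, so use the id shown on the actual model page, and note that the project documents its own Llama-2-style chat prompt template (a bare prompt is used here only to keep the sketch short).

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hfl/chinese-alpaca-2-7b-rlhf"  # assumed id; check the model page
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"  # device_map needs accelerate
)

prompt = "请用一句话介绍一下大语言模型。"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))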
Chinese-Alpaca-2-LoRA-13B-16K: the LoRA model for Chinese-Alpaca-2-13B-16K (context size 16K).
Chinese-Alpaca-2-LoRA-7B-64K: WARNING: a LoRA model cannot be used alone; it must be applied on top of its base model (see the merge sketch after these entries). This repository contains Chinese-Alpaca-2-LoRA-7B-64K.
Chinese-Alpaca-2-LoRA-7B-16K: the LoRA model for Chinese-Alpaca-2-7B-16K (context size 16K).
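As referenced in the warning above, a generic sketch of applying a LoRA adapter to its base model with peft and saving a merged checkpoint. This is not the project's official merge procedure (the Chinese-Alpaca-2 repositories document their own merge scripts and the exact base model to use); both paths below are placeholders.

import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "path/to/base-model"       # placeholder: the base model named on the LoRA card
adapter_id = "path/to/lora-adapter"  # placeholder: one of the LoRA repositories above

base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.float16)
model = PeftModel.from_pretrained(base, adapter_id)
model = model.merge_and_unload()     # fold the LoRA weights into the base model

tokenizer = AutoTokenizer.from_pretrained(adapter_id)
model.save_pretrained("merged-model")
tokenizer.save_pretrained("merged-model")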
The catalog currently lists 162,703 items in total.