Ollama
Official site: https://ollama.com/
To view the ollama help on a Windows system:
ollama.exe serve --help
Models are downloaded to the default local directory:
~/.ollama/models
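As a quick sanity check after pulling a model, a minimal Python sketch can list what is stored under that default directory (this assumes the location has not been overridden, e.g. via the OLLAMA_MODELS environment variable):
# Minimal sketch: list files under the default Ollama model directory.
# Assumes the default location (~/.ollama/models) is in use.
from pathlib import Path

models_dir = Path.home() / ".ollama" / "models"
for f in sorted(models_dir.rglob("*")):
    if f.is_file():
        size_mb = f.stat().st_size / (1024 * 1024)
        print(f"{size_mb:8.1f} MB  {f.relative_to(models_dir)}")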
Common commands
#Pull a model
ollama pull llama2
#Remove a model
ollama rm llama2
#Copy a model
ollama cp llama2 my-llama2
#List models on your computer
ollama list
#Show model information
ollama show --modelfile mistral
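These commands can also be scripted. For example, the local model list is exposed over the REST API described further below; a minimal Python sketch, assuming the server is running on the default localhost:11434 and using the /api/tags endpoint:
# Minimal sketch: list locally available models via the Ollama REST API.
# Assumes "ollama serve" is running on the default localhost:11434.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:11434/api/tags") as resp:
    data = json.load(resp)

for model in data.get("models", []):
    print(model.get("name"), model.get("size"))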
How to import a model
Ollama supports importing GGUF models in the Modelfile:
#1.Create a file named Modelfile, with a FROM instruction with the local filepath to the model you want to import.
FROM ./vicuna-33b.Q4_0.gguf
#2.Create the model in Ollama
ollama create example -f Modelfile
#3.Run the model
ollama run example
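The three import steps can also be wrapped in a small script. The sketch below is only an illustration (the GGUF file name and the model name "example" are placeholders); it writes the Modelfile and shells out to the ollama CLI:
# Minimal sketch: automate the Modelfile import steps with subprocess.
# The GGUF path and model name are placeholders; adjust them to your files.
import subprocess
from pathlib import Path

gguf_path = "./vicuna-33b.Q4_0.gguf"   # local GGUF weights to import
model_name = "example"

# 1. Create a Modelfile pointing at the local GGUF file
Path("Modelfile").write_text(f"FROM {gguf_path}\n")

# 2. Create the model in Ollama
subprocess.run(["ollama", "create", model_name, "-f", "Modelfile"], check=True)

# 3. Run the model interactively
subprocess.run(["ollama", "run", model_name], check=True)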
REST API
Ollama has a REST API for running and managing models.
Generate a response:
curl http://localhost:11434/api/generate -d '{
"model": "llama2",
"prompt":"Why is the sky blue?"
}'
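The same request can be issued from Python. A minimal sketch, assuming the default endpoint and setting "stream": false so the reply arrives as a single JSON object instead of a stream of chunks:
# Minimal sketch: call /api/generate without streaming.
# Assumes Ollama is serving on localhost:11434 and llama2 has been pulled.
import json
import urllib.request

payload = {
    "model": "llama2",
    "prompt": "Why is the sky blue?",
    "stream": False,   # return one JSON object instead of streamed chunks
}
req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["response"])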
Chat with a model:
curl http://localhost:11434/api/chat -d '{
"model": "mistral",
"messages": [
{ "role": "user", "cotet": "why is the sky blue?" }
]
}'
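For the chat endpoint, the conversation is passed as a list of messages, so follow-up turns can simply append to that list. A minimal sketch, again assuming the default port and "stream": false:
# Minimal sketch: call /api/chat with a message history, non-streaming.
# Assumes Ollama is serving on localhost:11434 and mistral has been pulled.
import json
import urllib.request

messages = [{"role": "user", "content": "why is the sky blue?"}]
payload = {"model": "mistral", "messages": messages, "stream": False}

req = urllib.request.Request(
    "http://localhost:11434/api/chat",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    reply = json.load(resp)["message"]

print(reply["content"])
# Append the assistant reply to keep the context for the next turn.
messages.append(reply)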
The contributors of this model have not provided a more detailed introduction. The model files and weights are available on the Model Files page. If you are a contributor to this model, you are invited to complete the model card according to the model contribution documentation.
SDK and Git download
You can download the model with the git clone command below, or with the ModelScope SDK.
#Install ModelScope
pip install modelscope
#Download the model via the SDK
from modelscope import snapshot_download
model_dir = snapshot_download('liush99/ollama_models')
#Download the model via git
git clone https://www.modelscope.cn/liush99/ollama_models.git
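Once the repository has been downloaded, any GGUF file inside it can be imported into Ollama with the Modelfile steps shown earlier. A minimal sketch (the GGUF file name below is a placeholder; pick an actual file from the downloaded directory):
# Minimal sketch: download with the ModelScope SDK, then import a GGUF
# file from the repository into Ollama. The GGUF file name is a placeholder.
import subprocess
from pathlib import Path
from modelscope import snapshot_download

model_dir = Path(snapshot_download('liush99/ollama_models'))

gguf_file = model_dir / "vicuna-33b.Q4_0.gguf"   # placeholder file name
Path("Modelfile").write_text(f"FROM {gguf_file}\n")

subprocess.run(["ollama", "create", "example", "-f", "Modelfile"], check=True)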