Anonymous user · July 31, 2024
Source: https://modelscope.cn/models/Xorbits/bge-m3
License: Apache License 2.0

Details

bge-m3

This repo is a mirror of the embedding model bge-m3.

Information

  • dimensions: 1024
  • max_tokens: 8192
  • language: zh, en
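
Embedding vectors like the 1024-dimensional ones bge-m3 produces are typically compared with cosine similarity. A minimal, dependency-free sketch (the toy 3-dimensional vectors below are purely illustrative, not model output):

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional vectors for illustration; bge-m3 returns 1024 dimensions.
v1 = [0.1, 0.2, 0.3]
v2 = [0.1, 0.2, 0.3]
print(cosine_similarity(v1, v2))  # identical vectors give a value of approximately 1.0
```

Values close to 1.0 indicate semantically similar texts; values near 0 indicate unrelated ones.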

Example code

Install packages

pip install "xinference[ggml]>=0.4.3"

If you want to run with GPU acceleration, refer to the installation guide.

Start a local instance of Xinference

xinference -p 9997

Launch and inference

from xinference.client import Client

# Connect to the local Xinference server started above.
client = Client("http://localhost:9997")

# Launch the bge-m3 embedding model and keep its UID for later lookups.
model_uid = client.launch_model(
    model_name="bge-m3",
    model_type="embedding",
)
model = client.get_model(model_uid)

input_text = "What is the capital of China?"
model.create_embedding(input_text)
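
`create_embedding` returns a response in the OpenAI-compatible embeddings format. The payload below is an illustrative, truncated mock (the vector values are made up, not real model output); it shows how to pull the vector out of the response shape:

```python
# Illustrative response in the OpenAI-compatible shape used for embeddings;
# the floats are placeholders, and a real bge-m3 vector has 1024 entries.
response = {
    "object": "list",
    "data": [
        {
            "object": "embedding",
            "index": 0,
            "embedding": [0.0123, -0.0456, 0.0789],  # truncated for illustration
        }
    ],
    "model": "bge-m3",
}

# Extract the embedding vector for the first (and only) input.
vector = response["data"][0]["embedding"]
print(len(vector))
```

When embedding a batch of texts, each input gets its own entry in `data`, ordered by `index`.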

More information

Xinference: Replace OpenAI GPT with another LLM in your app by changing a single line of code. Xinference gives you the freedom to use any LLM you need. With Xinference, you are empowered to run inference with any open-source language models, speech recognition models, and multimodal models, whether in the cloud, on-premises, or even on your laptop.

Join our Slack community!
