MiniCPM-Embedding-Light
by openbmb
Embedding Model
Edge AI: Mobile, Laptop, Server
Quick Summary
MiniCPM-Embedding-Light is a lightweight text embedding model from OpenBMB for retrieval tasks. It supports Chinese and English queries and accepts an optional task instruction prepended to each query.
Code Examples
Usage
Queries follow an instruction-plus-query template:
```text
Instruction: {{ instruction }} Query: {{ query }}
```
Chinese example (instruction: "Retrieve relevant answers to this medical question."; query: "What causes throat cancer?"):
```text
Instruction: 为这个医学问题检索相关回答。Query: 咽喉癌的成因是什么?
```
English example:
```text
Instruction: Given a claim about climate change, retrieve documents that support or refute the claim. Query: However the warming trend is slower than most climate models have forecast.
```
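The template above is plain string formatting, so it can be wrapped in a small helper. A minimal sketch; the `format_query` helper name is hypothetical, not part of the model's tooling:

```python
def format_query(query: str, instruction: str = "") -> str:
    """Render the 'Instruction: ... Query: ...' template.

    When no instruction is given, only the 'Query:' prefix is used,
    matching the bare-query case in the code example below.
    """
    if instruction:
        return f"Instruction: {instruction} Query: {query}"
    return f"Query: {query}"

print(format_query("What causes throat cancer?",
                   "Retrieve relevant answers to this medical question."))
# Instruction: Retrieve relevant answers to this medical question. Query: What causes throat cancer?
```

Passages are embedded as-is, without the template; only queries get the instruction prefix.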
```python
import asyncio

import numpy as np
from infinity_emb import AsyncEmbeddingEngine, AsyncEngineArray, EngineArgs

array = AsyncEngineArray.from_args([
    EngineArgs(
        model_name_or_path="openbmb/MiniCPM-Embedding-Light",
        engine="torch",
        dtype="float16",
        bettertransformer=False,
        pooling_method="mean",
        trust_remote_code=True,
    ),
])

queries = ["中国的首都是哪里?"]  # "What is the capital of China?"
passages = ["beijing", "shanghai"]  # "北京", "上海"
INSTRUCTION = "Query:"
queries = [f"{INSTRUCTION} {query}" for query in queries]

async def embed_text(engine: AsyncEmbeddingEngine, sentences):
    # Embed the sentences inside the engine's lifecycle context.
    async with engine:
        embeddings, usage = await engine.embed(sentences=sentences)
    return embeddings

queries_embedding = asyncio.run(embed_text(array[0], queries))
passages_embedding = asyncio.run(embed_text(array[0], passages))
scores = np.array(queries_embedding) @ np.array(passages_embedding).T
print(scores.tolist())  # [[0.40356746315956116, 0.36183443665504456]]
```
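The scores above are raw dot products of the pooled embeddings. If you want cosine similarity instead, row-normalize both matrices before the matrix product. A minimal sketch with toy vectors standing in for real model output:

```python
import numpy as np

def cosine_scores(queries: np.ndarray, passages: np.ndarray) -> np.ndarray:
    """Normalize each row to unit length, then take the dot product."""
    q = queries / np.linalg.norm(queries, axis=1, keepdims=True)
    p = passages / np.linalg.norm(passages, axis=1, keepdims=True)
    return q @ p.T

# Toy 3-dimensional embeddings (real embeddings are much higher-dimensional).
q = np.array([[1.0, 0.0, 0.0]])
p = np.array([[2.0, 0.0, 0.0], [0.0, 3.0, 0.0]])
print(cosine_scores(q, p).tolist())  # [[1.0, 0.0]]
```

With unit-length embeddings the dot product and cosine similarity coincide, so ranking order is unchanged; only the score scale differs.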