granite-st-infonce-stratified

by
mixedbread-ai
Embedding Model
Quick Summary

A sentence-transformers embedding model that maps text to 768-dimensional dense vectors for semantic similarity, search, and clustering. Judging by the name, it was trained with an InfoNCE-style contrastive objective on stratified data, though the card does not state this explicitly.
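The "infonce" in the model name points to the InfoNCE contrastive loss commonly used to train embedding models. As a rough illustration only (not the model's actual training code), here is a minimal NumPy sketch of InfoNCE over a batch of (anchor, positive) embedding pairs with in-batch negatives; the temperature value is an arbitrary assumption:

```python
import numpy as np

def info_nce_loss(anchors, positives, temperature=0.05):
    # Cosine-normalize both sets of embeddings
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    p = positives / np.linalg.norm(positives, axis=1, keepdims=True)
    # Score every anchor against every positive; off-diagonal
    # entries act as in-batch negatives
    logits = (a @ p.T) / temperature
    # Numerically stable log-softmax per row
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Cross-entropy with the matching positive on the diagonal
    return -np.mean(np.diag(log_probs))

# Toy check: matched pairs should score a much lower loss than mismatched ones
rng = np.random.default_rng(0)
anchors = rng.normal(size=(8, 16))
loss_matched = info_nce_loss(anchors, anchors)
loss_mismatched = info_nce_loss(anchors, np.roll(anchors, 1, axis=0))
print(loss_matched, loss_mismatched)
```

The loss rewards each anchor for being closer to its own positive than to every other positive in the batch.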

Code Examples

Usage (bash):
pip install -U sentence-transformers
Usage (Python):
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")  # placeholder id from the upstream template
# Run inference
sentences = [
    'Handle different commands',
    '                else if (command.asString() == "_checkbw")\n                {\n                    if (type == Type::HOST)\n                    {\n                        sendCheckBWResult(transactionId.asDouble());\n                    }\n                    else\n                    {\n                        Log(Log::Level::INFO) << idString << "Invalid message (\\"_checkbw\\"), disconnecting";\n                        close();\n                        return false;\n                    }\n                }\n                else if (command.asString() == "createStream")\n                {\n                    if (type == Type::HOST)\n                    {\n                        sendCreateStreamResult(transactionId.asDouble());\n                    }\n                    else\n                    {\n                        Log(Log::Level::INFO) << idString << "Invalid message (\\"createStream\\"), disconnecting";\n                        close();\n                        return false;\n                    }\n                }\n                else if (command.asString() == "releaseStream")\n                {\n                    if (type == Type::HOST)\n                    {\n                        sendReleaseStreamResult(transactionId.asDouble());\n                    }\n                    else\n                    {\n                        Log(Log::Level::INFO) << idString << "Invalid message (\\"releaseStream\\"), disconnecting";\n                        close();\n                        return false;\n                    }\n                }\n                else if (command.asString() == "deleteStream")\n                {\n                    if (type == Type::HOST)\n                    {\n                        if (stream)\n                        {\n                            close();\n                        }\n                    }\n                    else\n                    {\n                        Log(Log::Level::INFO) << idString << 
"Invalid message (\\"deleteStream\\"), disconnecting";\n                        close();\n                        return false;\n                    }\n                }',
    '        else if(args[i].compare("-v") == 0)\n        {\n            if(i == argc-1)\n                throw invalid_argument("need to specify number of folds\\\n                                        after -v");\n            i++;\n\n            if(!is_numerical(argv[i]))\n                throw invalid_argument("-v should be followed by a number");\n            option.nr_folds = atoi(argv[i]);\n\n            if(option.nr_folds < 2)\n                throw invalid_argument("number of folds\\\n                                        must be greater than one");\n            option.do_cv = true;\n        }\n        else if(args[i].compare("-f") == 0)\n        {\n            if(i == argc-1)\n                throw invalid_argument("need to specify loss function\\\n                                        after -f");\n            i++;\n\n            if(!is_numerical(argv[i]))\n                throw invalid_argument("-f should be followed by a number");\n            option.param.fun = atoi(argv[i]);\n        }\n        else if(args[i].compare("-n") == 0)\n        {\n            if(i == argc-1)\n                throw invalid_argument("need to specify the number of blocks\\\n                                        after -n");\n            i++;\n\n            if(!is_numerical(argv[i]))\n                throw invalid_argument("-n should be followed by a number");\n            option.param.nr_bins = atoi(argv[i]);\n        }\n        else if(args[i].compare("--nmf") == 0)\n        {\n            option.param.do_nmf = true;\n        }\n        else if(args[i].compare("--quiet") == 0)\n        {\n            option.param.quiet = true;\n        }\n        else if(args[i].compare("--disk") == 0)\n        {\n            option.on_disk = true;\n        }\n        else\n        {\n            break;\n        }\n    }',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.4449, 0.2997],
#         [0.4449, 1.0000, 0.2677],
#         [0.2997, 0.2677, 1.0000]])
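By default, `model.similarity` computes cosine similarity between the embedding rows, which is why the diagonal is exactly 1.0. A minimal NumPy sketch of that computation on toy 2-D vectors (not actual model output):

```python
import numpy as np

def cosine_similarity_matrix(embeddings):
    # Normalize each row to unit length; the Gram matrix of the
    # normalized rows is then the pairwise cosine similarity
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    normalized = embeddings / norms
    return normalized @ normalized.T

emb = np.array([[1.0, 0.0],
                [1.0, 1.0],
                [0.0, 1.0]])
sims = cosine_similarity_matrix(emb)
print(np.round(sims, 4))
# Diagonal is 1.0; sims[0, 1] and sims[1, 2] are about 0.7071 (cos 45°),
# and sims[0, 2] is 0.0 (orthogonal vectors)
```

The same ranking logic underlies semantic search: embed a query, score it against a corpus matrix, and sort by similarity.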

Deploy This Model

Production-ready deployment in minutes

Together.ai

Instant API access to this model

Fastest API

Production-ready inference API. Start free, scale to millions.

Try Free API

Replicate

One-click model deployment

Easiest Setup

Run models in the cloud with simple API. No DevOps required.

Deploy Now

Disclosure: We may earn a commission from these partners. This helps keep LLMYourWay free.