This is a sentence-transformers model: it maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.
This is a clone of the original model with the `pipeline_tag` metadata changed to `feature-extraction`, so it simply returns the embedding vector. It is otherwise unchanged.
Using this model is straightforward once you have `sentence-transformers` installed:
## Usage (HuggingFace Transformers)

Without sentence-transformers, you can use the model directly: first pass your input through the transformer model, then apply the right pooling operation on top of the contextualized word embeddings.
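The pooling step described above can be sketched as mean pooling over the token embeddings, weighted by the attention mask so padding positions are ignored. This is a common pooling choice for sentence-transformers models, shown here as a standalone PyTorch sketch:

```python
import torch

def mean_pooling(token_embeddings, attention_mask):
    """Average contextualized token embeddings, ignoring padding positions."""
    # Broadcast the attention mask across the embedding dimension
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    # Sum the embeddings of real tokens only
    summed = torch.sum(token_embeddings * mask, dim=1)
    # Divide by the number of real tokens (clamped to avoid division by zero)
    counts = torch.clamp(mask.sum(dim=1), min=1e-9)
    return summed / counts
```

With HuggingFace Transformers, you would run the tokenizer's output through the model and then pass `model_output.last_hidden_state` together with the attention mask to a function like this one.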
For an automated evaluation of this model, see the Sentence Embeddings Benchmark: https://seb.sbert.net
If you find this model helpful, feel free to cite our publication *Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks*: