gte-reranker-modernbert-base
by Alibaba-NLP

Embedding Model (reranker) · 194M parameters · 8K context · FP16 · 1 language · license: apache-2.0
598.7K downloads · Production-ready

Edge AI targets: Mobile, Laptop, Server (1GB+ RAM minimum)
Quick Summary

gte-reranker-modernbert-base is a 194M-parameter English text reranker from Alibaba-NLP, built on ModernBERT with an 8K context window, distributed in FP16 under the Apache-2.0 license.
Device Compatibility

Mobile: 4-6GB RAM
Laptop: 16GB RAM
Server: GPU
Minimum recommended: 1GB+ RAM
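The 1GB+ minimum is consistent with a back-of-envelope estimate: 194M parameters at FP16 (2 bytes each) come to roughly 0.36 GiB of raw weights, before activations and runtime overhead. A quick check in plain Python (no external dependencies):

```python
# Back-of-envelope memory estimate for the model weights at FP16.
params = 194_000_000        # parameter count from the model card
bytes_per_param = 2         # FP16 = 2 bytes per parameter
weight_bytes = params * bytes_per_param

weight_gib = weight_bytes / 2**30
print(f"{weight_gib:.2f} GiB")  # ~0.36 GiB of raw weights
```

Activations, tokenizer state, and framework overhead push actual usage above this floor, which is why the minimum is stated as 1GB+ rather than the raw weight size.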
Code Examples
```bash
# Optional: FlashAttention for faster inference on supported GPUs
pip install flash_attn
```
```bash
pip install sentence-transformers
```
```python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder

model = CrossEncoder(
    "Alibaba-NLP/gte-reranker-modernbert-base",
    automodel_args={"torch_dtype": "auto"},
)
pairs = [
    ["what is the capital of China?", "Beijing"],
    ["how to implement quick sort in python?", "Introduction of quick sort"],
    ["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664  0.9213594  0.15742092]
# NOTE: CrossEncoder applies a sigmoid activation to the raw logits by default,
# which maps the scores into the [0, 1] range. The corresponding raw logits are
# tensor([ 2.1387,  2.4609, -1.6729]).
```
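The relationship between the raw logits and the [0, 1] scores can be verified directly: applying the logistic sigmoid to the printed logits reproduces the reported scores to within rounding (the small residual on the third pair comes from the logit being printed at reduced precision). A standard-library-only sketch:

```python
import math

def sigmoid(x: float) -> float:
    """Logistic sigmoid: maps a raw logit into the (0, 1) range."""
    return 1.0 / (1.0 + math.exp(-x))

raw_logits = [2.1387, 2.4609, -1.6729]          # raw model outputs
reported = [0.8945664, 0.9213594, 0.15742092]   # scores from model.predict

for logit, score in zip(raw_logits, reported):
    print(f"sigmoid({logit:+.4f}) = {sigmoid(logit):.5f}  (reported {score:.5f})")
```

If you need the raw logits instead of sigmoid-squashed scores, pass a different activation function when constructing the CrossEncoder, or apply the inverse transform to the reported scores.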
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.tensor([ 2.1387, 2.4609, -1.6729])python
# Requires transformers>=4.48.0
from sentence_transformers import CrossEncoder
model = CrossEncoder(
"Alibaba-NLP/gte-reranker-modernbert-base",
automodel_args={"torch_dtype": "auto"},
)
pairs = [
["what is the capital of China?", "Beijing"],
["how to implement quick sort in python?","Introduction of quick sort"],
["how to implement quick sort in python?", "The weather is nice today"],
]
scores = model.predict(pairs)
print(scores)
# [0.8945664 0.9213594 0.15742092]
# NOTE: Sentence Transformers calls Softmax over the outputs by default, hence the scores are in [0, 1] range.Deploy This Model
Production-ready deployment in minutes
Together.ai
Instant API access to this model
Production-ready inference API. Start free, scale to millions.
Try Free APIReplicate
One-click model deployment
Run models in the cloud with simple API. No DevOps required.
Deploy NowDisclosure: We may earn a commission from these partners. This helps keep LLMYourWay free.