t5-small
2.5M downloads
512-token context (small)
6 languages
license: apache-2.0
by google-t5
Language Model
Battle-tested
Edge AI targets: Mobile · Laptop · Server
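The figures above can be checked against the published checkpoint itself. A minimal sketch using the transformers library (note that T5 uses relative position embeddings, so the 512-token figure is the sequence length used in training rather than a hard architectural limit):

from transformers import AutoConfig, AutoTokenizer

# Spot-check the published t5-small checkpoint against the figures above.
config = AutoConfig.from_pretrained("t5-small")
tokenizer = AutoTokenizer.from_pretrained("t5-small")

print(config.num_layers, config.num_heads, config.d_model)  # encoder depth, attention heads, hidden size
print(tokenizer.model_max_length)                           # 512, the context length used during training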
Quick Summary
1. Model Details
2. Uses
3. Bias, Risks, and Limitations
4. Training Details
5. Evaluation
6. Environmental Impact
7. Citation
8. Model Card Authors
9. How To Get Started With The Model
Training Data Analysis
🔵 Good (6.0/10)
A review of the training datasets used by t5-small, with a quality assessment.
Specialized for: general, multilingual
Training Datasets (1): c4 (🔵 6/10, general, multilingual)
Key Strengths
- Scale and Accessibility: 750GB of publicly available, filtered text (a loading sketch follows the Considerations list below)
- Systematic Filtering: Documented filtering heuristics make the pipeline reproducible
- Language Diversity: Although English-only, the corpus captures a wide range of writing styles and domains
Considerations
- English-Only: Limits multilingual applications
- Filtering Limitations: Offensive content and low-quality text remain despite filtering
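For a hands-on look at what the filtering keeps, the corpus can be sampled without downloading anywhere near 750GB. A minimal sketch, assuming the allenai/c4 mirror of the English split on the Hugging Face Hub and the datasets library:

from datasets import load_dataset

# Stream the English C4 split so nothing close to 750GB is downloaded up front.
c4 = load_dataset("allenai/c4", "en", split="train", streaming=True)

# Peek at a single filtered document; fields include the raw text and its source URL.
example = next(iter(c4))
print(example["url"])
print(example["text"][:300])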
Explore our comprehensive training dataset analysis
Code Examples
How to Get Started with the Model (Python, transformers)
from transformers import T5Tokenizer, T5Model
tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5Model.from_pretrained("t5-small")
input_ids = tokenizer(
"Studies have been shown that owning a dog is good for you", return_tensors="pt"
).input_ids # Batch size 1
decoder_input_ids = tokenizer("Studies show that", return_tensors="pt").input_ids # Batch size 1
# forward pass
outputs = model(input_ids=input_ids, decoder_input_ids=decoder_input_ids)
last_hidden_states = outputs.last_hidden_state
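The block above only exposes raw encoder-decoder hidden states through T5Model. For actual text-to-text use (translation, summarization, and the other tasks in the T5 training mixture), the conditional-generation head documented in the upstream transformers docs is the usual entry point. A minimal sketch:

from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# T5 is text-to-text: the task is selected with a plain-text prefix.
input_ids = tokenizer(
    "translate English to German: The house is wonderful.", return_tensors="pt"
).input_ids
outputs = model.generate(input_ids, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))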
Deploy This Model
Production-ready deployment in minutes
Together.ai
Instant API access to this model
Production-ready inference API. Start free, scale to millions.
Try Free API
Replicate
One-click model deployment
Run models in the cloud with a simple API. No DevOps required.
Deploy Now
Disclosure: We may earn a commission from these partners. This helps keep LLMYourWay free.