Mlong T5 Tglobal Large
by agemagician · Language Model · 102 languages · license: apache-2.0
40 downloads · Early-stage
Edge AI: Mobile · Laptop · Server
Quick Summary
MLongT5 (transient-global attention, large-sized model): a LongT5 variant pre-trained on a multilingual corpus.
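The transient-global (TGlobal) mechanism is what lets the model handle long inputs: each token attends locally within a radius r and to a small set of per-block global summary tokens, instead of to every other token. A rough cost comparison (r=127 and block size k=16 follow the LongT5 paper's defaults; the formula is an illustrative approximation, not the exact implementation):

```python
def attention_pairs(n: int, r: int = 127, k: int = 16) -> tuple[int, int]:
    """Approximate attended token pairs for full vs. TGlobal attention.

    n: sequence length; r: local attention radius; k: block size used to
    form global summary tokens. Counts ignore padding and edge effects.
    """
    full = n * n                              # every token attends to every token
    tglobal = n * (2 * r + 1) + n * (n // k)  # local window + global summary tokens
    return full, tglobal

full, tg = attention_pairs(4096)
print(f"full: {full:,}  tglobal: {tg:,}")  # TGlobal is roughly 8x cheaper at 4k tokens
```

The gap widens with sequence length, since the local term grows linearly while full attention grows quadratically.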
Training Data Analysis
🔵 Good (6.0/10)
Quality assessment of the training datasets used by Mlong T5 Tglobal Large.
Specialized For: general, multilingual
Training Datasets (1)
c4 · 🔵 6/10 · general, multilingual
Key Strengths
- Scale and Accessibility: 750GB of publicly available, filtered text
- Systematic Filtering: documented heuristics enable reproducibility
- Language Diversity: despite being English-only, C4 captures a wide range of writing styles
Considerations
- English-Only: the analyzed C4 corpus limits multilingual applications
- Filtering Limitations: offensive content and low-quality text remain despite filtering
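The "systematic filtering" point above refers to C4's documented cleaning heuristics (keep lines ending in terminal punctuation, drop very short lines, discard placeholder text, and so on). A toy sketch of the idea follows; it is not the actual C4 pipeline, which also applies bad-word filtering and deduplication, and all thresholds here are illustrative:

```python
def c4_style_clean(page: str, min_words: int = 5, min_kept_lines: int = 3) -> bool:
    """Decide whether a scraped page survives simplified C4-like filtering."""
    lines = [ln.strip() for ln in page.splitlines() if ln.strip()]
    kept = [
        ln for ln in lines
        if ln.endswith((".", "!", "?", '"'))  # keep lines with terminal punctuation
        and len(ln.split()) >= min_words      # drop short navigation/boilerplate lines
        and "lorem ipsum" not in ln.lower()   # drop placeholder text
    ]
    return len(kept) >= min_kept_lines        # require several real sentences

good = ("First real sentence here today.\n"
        "Another complete sentence follows it.\n"
        "A third one closes the page.")
bad = "Home\nAbout\nContact\nLorem ipsum dolor sit amet."
print(c4_style_clean(good), c4_style_clean(bad))  # → True False
```

Rules like these are cheap and reproducible, which is the strength noted above, but they are also why some low-quality text slips through.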
Code Examples
S-Denoising (python, transformers)
from transformers import LongT5ForConditionalGeneration, T5Tokenizer
import torch

# Load in bfloat16 on the GPU to keep memory usage manageable
model = LongT5ForConditionalGeneration.from_pretrained(
    "agemagician/mlong-t5-tglobal-large",
    low_cpu_mem_usage=True,
    torch_dtype=torch.bfloat16,
).to("cuda")
tokenizer = T5Tokenizer.from_pretrained("agemagician/mlong-t5-tglobal-large")

# The "[S2S]" prefix selects the sequential (S-denoising) pre-training task;
# "<extra_id_0>" marks where the model should continue the text.
input_string = "[S2S] Mr. Dursley was the director of a firm called Grunnings, which made drills. He was a big, solid man with a bald head. Mrs. Dursley was thin and blonde and more than the usual amount of neck, which came in very useful as she spent so much of her time craning over garden fences, spying on the neighbours. The Dursleys had a small son called Dudley and in their opinion there was no finer boy anywhere <extra_id_0>"
inputs = tokenizer(input_string, return_tensors="pt").input_ids.to("cuda")
outputs = model.generate(inputs, max_length=200)
print(tokenizer.decode(outputs[0]))