kugelaudio-0-open
by kugelaudio · Audio Model · license: MIT
7K downloads · Early-stage
Edge AI: Mobile, Laptop, Server
Quick Summary
An early-stage text-to-speech model from kugelaudio that generates speech from text in a choice of preset voices ("default", "warm", "clear").
Code Examples
Quick Start

```python
from kugelaudio_open import (
    KugelAudioForConditionalGenerationInference,
    KugelAudioProcessor,
)
import torch

# Load model
device = "cuda" if torch.cuda.is_available() else "cpu"
model = KugelAudioForConditionalGenerationInference.from_pretrained(
    "kugelaudio/kugelaudio-0-open",
    torch_dtype=torch.bfloat16,
).to(device)
model.eval()

processor = KugelAudioProcessor.from_pretrained("kugelaudio/kugelaudio-0-open")

# Strip encoder weights to save VRAM (only decoders needed for inference)
model.model.strip_encoders()

# See available voices
print(processor.get_available_voices())  # ["default", "warm", "clear"]

# Generate speech with a specific voice
inputs = processor(text="Hallo Welt! Das ist KugelAudio.", voice="default", return_tensors="pt")
inputs = {k: v.to(device) if isinstance(v, torch.Tensor) else v for k, v in inputs.items()}
with torch.no_grad():
    outputs = model.generate(**inputs, cfg_scale=3.0)

# Save audio
processor.save_audio(outputs.speech_outputs[0], "output.wav")
```

Voices

```python
# List available voices (assumes model and processor from Quick Start are in scope)
voices = processor.get_available_voices()
print(voices)  # ["default", "warm", "clear"]

# Generate with a specific voice
inputs = processor(text="Hallo, das ist eine warme Stimme!", voice="warm", return_tensors="pt")
inputs = {k: v.to(device) if isinstance(v, torch.Tensor) else v for k, v in inputs.items()}
with torch.no_grad():
    outputs = model.generate(**inputs, cfg_scale=3.0)
processor.save_audio(outputs.speech_outputs[0], "warm_voice_output.wav")
```
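The `cfg_scale` argument passed to `generate` controls classifier-free guidance: the model's conditional and unconditional predictions are blended, and a scale above 1 pushes generation harder toward the conditioning text. A generic sketch of the blending rule (not KugelAudio's internal implementation; the function name is illustrative):

```python
def cfg_blend(uncond, cond, scale):
    # Classifier-free guidance: uncond + scale * (cond - uncond).
    # scale == 1.0 returns the conditional prediction unchanged;
    # larger values amplify the difference from the unconditional one.
    return [u + scale * (c - u) for u, c in zip(uncond, cond)]

print(cfg_blend([0.0, 1.0], [1.0, 2.0], 3.0))  # [3.0, 4.0]
```

Note that when `uncond` and `cond` agree, the scale has no effect; it only amplifies where the conditioning actually changes the prediction.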
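After generation, the saved WAV files can be sanity-checked with Python's standard library. The snippet below writes a one-second silent stand-in file so it runs anywhere; with KugelAudio you would open "output.wav" instead (the 24 kHz rate here is only a placeholder, not the model's documented sample rate):

```python
import wave

RATE = 24000  # placeholder sample rate for the stand-in file

# Create a stand-in WAV (1 s of 16-bit mono silence) so the check is self-contained.
with wave.open("demo.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(RATE)
    f.writeframes(b"\x00\x00" * RATE)

# Inspect duration, sample rate, and channel count.
with wave.open("demo.wav", "rb") as f:
    duration = f.getnframes() / f.getframerate()
    print(f"{duration:.2f} s at {f.getframerate()} Hz, {f.getnchannels()} channel(s)")
    # 1.00 s at 24000 Hz, 1 channel(s)
```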