MTVCraft

license: apache-2.0
by BAAI
Video Model
Quick Summary

MTVCraft is an open, Veo3-style audio-video generation demo pipeline.

Code Examples

Clone the repository:

```bash
git clone https://github.com/baaivision/MTVCraft
cd MTVCraft
```
Create and activate a conda environment:

```bash
conda create -n mtv python=3.10
conda activate mtv
```
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
bash
pip install -r requirements.txt
Install ffmpeg (shown for Debian/Ubuntu; may require sudo):

```bash
apt-get install ffmpeg
```
Download the pretrained models from Hugging Face:

```bash
cd $ProjectRootDir
pip install "huggingface_hub[cli]"
huggingface-cli download BAAI/MTVCraft --local-dir ./pretrained_models
```
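After the download finishes, it can be worth sanity-checking that the target directory is actually populated before moving on to inference. A minimal sketch (the directory name matches the `--local-dir` above; `check_pretrained_dir` is an illustrative helper, not part of the repo):

```python
from pathlib import Path

def check_pretrained_dir(root="./pretrained_models"):
    """Return True if the model directory exists and contains at least one entry."""
    p = Path(root)
    return p.is_dir() and any(p.iterdir())

if not check_pretrained_dir():
    print("pretrained_models is missing or empty; re-run the huggingface-cli download")
```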
🎮 Run Inference

Configure your API keys in `mtv/utils.py`:

```python
# mtv/utils.py

from openai import OpenAI
from elevenlabs.client import ElevenLabs

qwen_model_name = "qwen-plus"  # or another model name you prefer
qwen_api_key = "YOUR_QWEN_API_KEY"  # replace with your actual Qwen API key

# Qwen is accessed through DashScope's OpenAI-compatible endpoint.
client = OpenAI(
    api_key=qwen_api_key,
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
)

elevenlabs = ElevenLabs(
    api_key="YOUR_ELEVENLABS_API_KEY",  # replace with your actual ElevenLabs API key
)
```
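Hardcoding keys in `mtv/utils.py` works for a quick demo, but reading them from the environment is safer if you ever share or commit the file. A minimal sketch of that alternative (the environment variable names and the `load_api_config` helper are illustrative, not part of the repo):

```python
import os

def load_api_config():
    """Read API credentials from environment variables, falling back to the defaults above."""
    return {
        "qwen_model_name": os.environ.get("QWEN_MODEL_NAME", "qwen-plus"),
        "qwen_api_key": os.environ.get("QWEN_API_KEY", ""),
        "elevenlabs_api_key": os.environ.get("ELEVENLABS_API_KEY", ""),
    }

cfg = load_api_config()
for key in ("qwen_api_key", "elevenlabs_api_key"):
    if not cfg[key]:
        print(f"Warning: {key} is not set; export the corresponding environment variable")
```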
