wasmdashai
vits-ar-sa-huba-v2
vits-ar
vits-ar-sa-A
vits-ar-ye-sa
This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.

- Developed by: [More Information Needed]
- Funded by [optional]: [More Information Needed]
- Shared by [optional]: [More Information Needed]
- Model type: [More Information Needed]
- Language(s) (NLP): [More Information Needed]
- License: [More Information Needed]
- Finetuned from model [optional]: [More Information Needed]
- Repository: [More Information Needed]
- Paper [optional]: [More Information Needed]
- Demo [optional]: [More Information Needed]

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).

- Hardware Type: [More Information Needed]
- Hours used: [More Information Needed]
- Cloud Provider: [More Information Needed]
- Compute Region: [More Information Needed]
- Carbon Emitted: [More Information Needed]
Wasm 32B Instruct V1
`wasm-32B-Instruct-V1` is a state-of-the-art instruction-tuned large language model developed by wasmdashai. With 32 billion parameters, it is designed to deliver high-quality performance across a wide range of natural language processing and code-related tasks.

🚀 Introduction

`wasm-32B-Instruct-V1` is built for instruction-following tasks and general-purpose reasoning. It leverages a powerful transformer architecture with performance optimized for large-scale generation tasks, including:

- 🧠 Code generation and debugging
- 📚 Long-context understanding
- 🗣️ Multi-turn dialogue and reasoning
- 🔐 Privacy-conscious edge deployments (e.g., via WebAssembly)

The model is fine-tuned on diverse instruction datasets and optimized for both human alignment and computational efficiency.

Model details

- Type: Causal language model (decoder-only)
- Parameters: 32 billion
- Training: Pretraining + instruction fine-tuning
- Architecture: Transformer with rotary position embeddings (RoPE), SwiGLU activation, RMSNorm, and attention with QKV bias
- Context length: Up to 32,768 tokens
- Extended context option: via `rope_scaling` (supports up to 128K tokens with YaRN)
- Format: Hugging Face Transformers-compatible

Quickstart

To use this model, install the latest version of 🤗 `transformers` (>= 4.37.0 recommended), e.g. `pip install "transformers>=4.37.0"`. A minimal load-and-generate example appears under "Usage sketches" below.

Long-context inputs

This model supports context lengths up to 32,768 tokens. For even longer inputs, you can enable YaRN scaling by modifying the `rope_scaling` entry in the model's `config.json`; a sketch appears under "Usage sketches" below. This is ideal for handling documents, logs, or multi-step reasoning tasks that exceed standard limits.

Deployment

We recommend using `vLLM` for efficient deployment, especially with large input lengths or real-time serving needs; see the serving sketch under "Usage sketches" below. Please note: `vLLM` currently supports static YaRN only. Avoid applying rope scaling unless necessary for long-context tasks, as it may impact performance on short inputs.

Contact

For support, feedback, or collaboration inquiries, please contact:
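Usage sketches

A minimal sketch of the Quickstart above. The Hub repository id `wasmdashai/wasm-32B-Instruct-V1` is an assumption (this card does not state the exact id), as are the dtype and generation settings:

```python
# Minimal sketch: load the model and generate one chat response.
# Assumptions: hypothetical repo id, and that the tokenizer ships a chat template.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "wasmdashai/wasm-32B-Instruct-V1"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 32B weights need ~64 GB in bf16
    device_map="auto",           # shard across available GPUs
)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Write a Python function that reverses a string."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```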
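A sketch of the long-context change described above. The `rope_scaling` field names and the factor are assumptions modeled on YaRN-style checkpoint configs; the equivalent edit can also be made directly in the repo's `config.json`:

```python
# Sketch: enable static YaRN rope scaling at load time instead of editing config.json.
# The schema and factor below are assumptions, not values confirmed by this card.
from transformers import AutoConfig, AutoModelForCausalLM

model_id = "wasmdashai/wasm-32B-Instruct-V1"  # hypothetical repo id

config = AutoConfig.from_pretrained(model_id)
config.rope_scaling = {
    "type": "yarn",
    "factor": 4.0,  # 4 x 32,768 native tokens ≈ 131K effective context
    "original_max_position_embeddings": 32768,
}

model = AutoModelForCausalLM.from_pretrained(model_id, config=config, device_map="auto")
```

As the card notes, leave `rope_scaling` unset for ordinary workloads, since static scaling can degrade quality on short inputs.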
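A serving sketch for the `vLLM` recommendation above; the repo id, parallelism degree, and maximum length are assumptions:

```python
# Sketch: offline batch generation with vLLM. The same model id also works with
# the `vllm serve` CLI for an OpenAI-compatible endpoint. Repo id is hypothetical.
from vllm import LLM, SamplingParams

llm = LLM(
    model="wasmdashai/wasm-32B-Instruct-V1",  # hypothetical repo id
    max_model_len=32768,     # raise only if static YaRN is configured
    tensor_parallel_size=4,  # assumption: shard the 32B weights across 4 GPUs
)

params = SamplingParams(temperature=0.7, top_p=0.9, max_tokens=256)
outputs = llm.generate(["Summarize the benefits of WebAssembly."], params)
print(outputs[0].outputs[0].text)
```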
vit-face-expression-v1
vits-eng-us-ljs
vits-en-v1