Cactus-Dream-Horror-12B

License: apache-2.0
By EldritchLabs
Language model · 12B params
Edge AI: mobile, laptop, or server (27GB+ RAM)
Quick Summary

Cactus-Dream-Horror-12B is a 12B-parameter language model built as a DELLA merge (via mergekit) of sixteen Mistral-Nemo-2407-based community models, largely roleplay- and creative-writing-oriented finetunes judging by the component model names (see the merge configuration below).
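Per the configuration below, every component is merged with mergekit's della method using density 0.9, weight 0.1, and epsilon 0.099. A toy sketch of the DELLA idea — magnitude-aware stochastic pruning of task deltas with rescaling — is shown here; this is illustrative only and not mergekit's actual implementation:

```python
import random

def della_prune(delta, density=0.9, epsilon=0.099, seed=420):
    """Toy DELLA-style pruning of a list of delta weights.

    Larger-magnitude deltas get a lower drop probability; drop
    probabilities are spread by +/- epsilon/2 around the base drop
    rate (1 - density), and survivors are rescaled by 1/(1 - p) so
    the merge stays unbiased in expectation.
    """
    rng = random.Random(seed)
    n = len(delta)
    base_drop = 1.0 - density
    # Rank deltas by magnitude: rank 0 = smallest, n-1 = largest.
    order = sorted(range(n), key=lambda i: abs(delta[i]))
    drop_p = [0.0] * n
    for rank, i in enumerate(order):
        frac = rank / max(n - 1, 1)            # 0 (smallest) .. 1 (largest)
        drop_p[i] = base_drop + epsilon * (0.5 - frac)
    out = []
    for d, p in zip(delta, drop_p):
        if rng.random() < p:
            out.append(0.0)                    # dropped
        else:
            out.append(d / (1.0 - p))          # rescaled survivor
    return out
```

With density 0.9 roughly 90% of each model's deltas survive, and the low per-model weight (0.1) keeps any single component from dominating the merge.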

Device Compatibility

Mobile: 4-6GB RAM
Laptop: 16GB RAM
Server: GPU
Minimum recommended: 12GB+ RAM
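These tiers are consistent with rough quantization arithmetic for a 12B-parameter model. An illustrative estimate, not a measured benchmark — the 10% overhead factor for cache and activations is an assumption:

```python
def approx_ram_gib(params_billion=12.0, bytes_per_param=2.0, overhead=1.1):
    """Rough weight-memory estimate in GiB: params * bytes per param,
    plus ~10% headroom for KV cache and activations (assumed)."""
    return params_billion * 1e9 * bytes_per_param * overhead / 2**30

# bfloat16 (this merge's out_dtype): ~24-25 GiB  -> server tier
# 8-bit quantization:                ~12 GiB     -> the 12GB+ minimum
# 4-bit quantization:                ~6 GiB      -> the 4-6GB mobile tier
```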

Code Examples

Configuration (YAML)
architecture: MistralForCausalLM
base_model: B:/12B/models--p-e-w--Mistral-Nemo-Instruct-2407-heretic-noslop
models:
  - model: B:/12B/models--p-e-w--Mistral-Nemo-Instruct-2407-heretic-noslop
  - model: B:/12B/models--BeaverAI--MN-2407-DSK-QwQify-v0.1-12B
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--crestf411--MN-Slush
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--D1rtyB1rd--Egregore-Alice-RP-NSFW-12B
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--D1rtyB1rd--Looking-Glass-Alice-Thinking-NSFW-RP
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--Delta-Vector--Francois-PE-V2-Huali-12B
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--Delta-Vector--Ohashi-NeMo-12B
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--Delta-Vector--Rei-V3-KTO-12B
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--Epiculous--Violet_Twilight-v0.2
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--elinas--Chronos-Gold-12B-1.0
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--inflatebot--MN-12B-Mag-Mell-R1
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--MarinaraSpaghetti--NemoMix-Unleashed-12B
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--Sao10K--MN-12B-Vespa-x1
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--TheDrummer--Rocinante-12B-v1.1
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--TheDrummer--UnslopNemo-12B-v4.1
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
  - model: B:/12B/models--Vortex5--Crimson-Constellation-12B
    parameters:
      density: 0.9
      weight: 0.1
      epsilon: 0.099
# --lazy-unpickle --random-seed 420 --cuda --fix-mistral-regex
merge_method: della
parameters:  
  lambda: 1.0
  normalize: false
  int8_mask: false
dtype: float32
out_dtype: bfloat16
tokenizer:  
  source: "union"  
  tokens:  
    # Force ChatML EOS tokens  
    "<|im_start|>":  
      source: "B:/12B/models--D1rtyB1rd--Egregore-Alice-RP-NSFW-12B"  
      force: true  
    "<|im_end|>":  
      source: "B:/12B/models--D1rtyB1rd--Egregore-Alice-RP-NSFW-12B"  
      force: true  
    # Keep Mistral tokens  
    "[INST]":  
      source: "B:/12B/models--p-e-w--Mistral-Nemo-Instruct-2407-heretic-noslop"  
      # source: "B:/12B/models--mistralai--Mistral-Nemo-Instruct-2407"
      # Note: the tokenizer system requires every model referenced in a token
      # configuration to be present in the merge's model list so that the
      # proper embedding permutations can be built.
    "[/INST]":  
      source: "B:/12B/models--p-e-w--Mistral-Nemo-Instruct-2407-heretic-noslop"  
    # Force </s> as fallback EOS  
    "</s>":  
      source: "B:/12B/models--p-e-w--Mistral-Nemo-Instruct-2407-heretic-noslop"  
      force: true

chat_template: "chatml"
name: 🌵 Cactus-Dream-Horror-12B
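Because the configuration sets chat_template: "chatml" and force-adds the <|im_start|>/<|im_end|> control tokens, prompts are expected in ChatML form. A minimal formatter sketch (function name is illustrative; in practice the tokenizer's built-in chat template does this for you):

```python
def format_chatml(messages):
    """Render a list of {role, content} messages in ChatML, the
    template this merge enforces via its tokenizer overrides."""
    body = "".join(
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
        for m in messages
    )
    # Trailing open assistant turn acts as the generation prompt.
    return body + "<|im_start|>assistant\n"
```

The config also keeps the Mistral [INST]/[/INST] tokens and force-sets </s> as a fallback EOS, so Mistral-style prompting may still work, but ChatML is the intended format.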
