KansenSakura-Conflagration-RP-12b

License: apache-2.0
Author: Retreatcost
Type: Language Model
Size: 12B params
Downloads: 24
Quick Summary

KansenSakura-Conflagration-RP-12b is a 12B-parameter roleplay-oriented language model by Retreatcost, assembled with mergekit: a layer-wise multislerp merge of six 12B models, followed by a nearswap pass that folds a "darken" variant back into the release candidate.

Device Compatibility

Mobile: 4-6GB RAM
Laptop: 16GB RAM
Server: GPU
Minimum recommended: 12GB+ RAM
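As a rough sanity check on these figures, resident memory for a 12B-parameter model scales with bytes per weight: about 24 GB in bfloat16 and roughly 6-7 GB at 4-bit quantization, before activations and KV cache. A back-of-the-envelope estimate (the 1.2x overhead factor is an illustrative assumption, not a benchmark):

```python
# Back-of-the-envelope memory estimate for a 12B-parameter model.
# Weight memory = parameter count * bytes per weight; the 1.2x overhead
# factor (activations, KV cache) is an assumption for illustration only.

PARAMS = 12e9

def weight_gb(bits_per_weight: float, overhead: float = 1.2) -> float:
    """Approximate resident memory in GB at the given precision."""
    return PARAMS * (bits_per_weight / 8) * overhead / 1e9

print(f"bfloat16: ~{weight_gb(16):.0f} GB")
print(f"8-bit:    ~{weight_gb(8):.0f} GB")
print(f"4-bit:    ~{weight_gb(4):.0f} GB")
```

This lines up with the table above: full-precision inference wants a GPU server, while 4-8 bit quantized builds fit laptop (and high-end mobile) memory budgets.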

Code Examples

```yaml
merge_method: multislerp
models:
  - model: yamatazen/FusionEngine-12B-Lorablated
    parameters:
      weight: [1.000, 1.000, 1.000, 1.000, 1.000, 0.968, 0.744, 0.256, 0.030, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.500, 0.500]
  - model: SakuraSynth-12B
    parameters:
      weight: [0.000, 0.000, 0.000, 0.000, 0.030, 0.256, 0.744, 0.968, 1.000, 1.000, 1.000, 1.000, 0.968, 0.744, 0.256, 0.030, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.500, 0.500]
  - model: Retreatcost/Chrysologus-12B
    parameters:
      weight: [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.030, 0.256, 0.744, 0.968, 1.000, 1.000, 1.000, 0.968, 0.744, 0.256, 0.030, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000]
  - model: FinalWind-12B
    parameters:
      weight: [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.030, 0.256, 0.744, 0.968, 1.000, 1.000, 1.000, 1.000, 0.968, 0.744, 0.256, 0.030, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000]
  - model: Darkness-12B
    parameters:
      weight: [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.030, 0.256, 0.744, 0.968, 1.000, 1.000, 1.000, 1.000, 0.968, 0.744, 0.256, 0.030, 0.000, 0.000, 0.000, 0.000]
  - model: Pyrographos-12B
    parameters:
      weight: [0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.000, 0.030, 0.256, 0.744, 0.968, 1.000, 1.000, 1.000, 0.000, 0.000]
dtype: bfloat16
parameters:
  normalize: true
tokenizer_source: Retreatcost/KansenSakura-Erosion-RP-12b
```
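The long weight vectors above assign one coefficient per layer, so each source model dominates a different depth band and adjacent bands cross-fade (0.030 → 0.256 → 0.744 → 0.968). A minimal sketch of the idea, using a normalized per-layer *linear* blend; mergekit's multislerp interpolates on the sphere rather than linearly, but the layer-wise weighting works the same way:

```python
import numpy as np

# Illustration only: a normalized per-layer weighted blend of same-shaped
# tensors. The normalization step mirrors the config's `normalize: true`.

def blend_layer(tensors: list[np.ndarray], weights: list[float]) -> np.ndarray:
    """Blend one layer's tensors from several models with normalized weights."""
    w = np.array(weights, dtype=np.float64)
    w = w / w.sum()  # normalize: true -> weights sum to 1 per layer
    return sum(wi * t for wi, t in zip(w, tensors))

# Toy tensors standing in for one layer of two models.
a, b = np.ones(4), np.full(4, 3.0)
mid = blend_layer([a, b], [0.968, 0.256])   # cross-fade region: mostly model A
deep = blend_layer([a, b], [0.000, 1.000])  # band owned entirely by model B
```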
```yaml
merge_method: nearswap
base_model: KansenSakura-Conflagration-RP-12b-RC3
models:
  - model: KansenSakura-Conflagration-RP-12b-darken
parameters:
  t: [0.0000, 0.0000, 0.0000, 0.000275, 0.000550, 0.0000, 0.000275]
dtype: bfloat16
tokenizer_source: KansenSakura-Conflagration-RP-12b-RC3
```
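In the second pass, nearswap keeps the base model's weights but moves toward the secondary model only where the two are already numerically close, with t setting the closeness scale (here zero for most layer groups, so those keep the base untouched). A rough sketch under that reading; this is not mergekit's exact implementation, and the element-wise formula is an assumption for illustration:

```python
import numpy as np

# Rough illustration of a "near swap": interpolate toward the secondary
# model where base and secondary nearly agree, leave large disagreements
# alone. `t` is a closeness scale, like the per-layer t values in the config.

def nearswap(base: np.ndarray, other: np.ndarray, t: float) -> np.ndarray:
    if t <= 0:
        return base.copy()  # t = 0.0 layers keep the base as-is
    diff = np.abs(base - other)
    # alpha -> 1 (full swap) where diff << t, alpha -> 0 where diff >> t.
    alpha = np.clip(t / np.maximum(diff, 1e-12), 0.0, 1.0)
    return (1 - alpha) * base + alpha * other
```

The tiny t values (on the order of 1e-4) mean only near-identical weights are touched, which makes this a gentle refinement of RC3 rather than a second full merge.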
