Magnolia-v3-medis-remix-12B
by grimjim

Language model, 12.0B parameters
License: apache-2.0
Status: new, early-stage
Downloads: 2
Edge AI targets: mobile, laptop, server (27GB+ RAM)
Quick Summary

A 12B-parameter task-arithmetic merge of several Mistral Nemo 12B variants, including instruct-tuned and specialized donors (see the merge configuration below).

Device Compatibility

Mobile: 4-6GB RAM
Laptop: 16GB RAM
Server: GPU
Minimum recommended: 12GB+ RAM

Code Examples

Chat Template (Jinja, text)
{{ bos_token }}
{% for message in messages %}
    {% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}
        {{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}
    {% endif %}
    {% if message['role'] == 'user' %}
        {{ '[INST]' + message['content'] + '[/INST]' }}
    {% elif message['role'] == 'assistant' %}
        {{ message['content'] + eos_token }}
    {% else %}
        {{ raise_exception('Only user and assistant roles are supported!') }}
    {% endif %}
{% endfor %}
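The template above enforces strict user/assistant alternation, wraps each user turn in [INST]...[/INST], and closes each assistant turn with the EOS token. A minimal pure-Python sketch of the same rendering logic (the token strings below are placeholders; the real values come from the model's tokenizer):

```python
def render_chat(messages, bos_token="<s>", eos_token="</s>"):
    """Mimic the Jinja chat template: alternate user/assistant turns,
    wrap user content in [INST]...[/INST], close assistant turns with EOS."""
    out = bos_token
    for i, msg in enumerate(messages):
        role, content = msg["role"], msg["content"]
        # Even-indexed turns must be user, odd-indexed must be assistant.
        if (role == "user") != (i % 2 == 0):
            raise ValueError("Conversation roles must alternate user/assistant/...")
        if role == "user":
            out += "[INST]" + content + "[/INST]"
        elif role == "assistant":
            out += content + eos_token
        else:
            raise ValueError("Only user and assistant roles are supported!")
    return out

prompt = render_chat([
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!"},
])
print(prompt)  # <s>[INST]Hi[/INST]Hello!</s>
```

In practice you would not hand-roll this: `tokenizer.apply_chat_template(messages)` in the transformers library renders the stored template directly.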
Merge Configuration (YAML)
base_model: grimjim/mistralai-Mistral-Nemo-Base-2407
dtype: bfloat16
merge_method: task_arithmetic
parameters:
  normalize: true
slices:
- sources:
  - layer_range: [0, 40]
    model: grimjim/mistralai-Mistral-Nemo-Base-2407
  - layer_range: [0, 40]
    model: grimjim/mistralai-Mistral-Nemo-Instruct-2407
    parameters:
      weight: 0.9
  - layer_range: [0, 40]
    model: grimjim/magnum-consolidatum-v1-12b
    parameters:
      weight: 0.1
  - layer_range: [0, 40]
    model: grimjim/magnum-twilight-12b
    parameters:
      weight: 0.001
  - layer_range: [0, 40]
    model: exafluence/EXF-Medistral-Nemo-12B
    parameters:
      weight: 0.000001
  - layer_range: [0, 40]
    model: nbeerbower/Mistral-Nemo-Prism-12B
    parameters:
      weight: 0.05
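In a task-arithmetic merge, each donor model contributes its delta from the base model scaled by its weight, and with `normalize: true` the combined delta is rescaled by the total weight. The sketch below illustrates that arithmetic on plain scalars using the weights from the config above; it is a simplified illustration, not mergekit's implementation, which applies the same operation across full weight tensors:

```python
def task_arithmetic(base, donors, normalize=True):
    """Toy task-arithmetic merge on scalar 'parameters'.

    donors: list of (value, weight) pairs.
    merged = base + sum(w * (v - base)), optionally divided by the
    total weight when normalize is enabled.
    """
    total_w = sum(w for _, w in donors)
    delta = sum(w * (v - base) for v, w in donors)
    if normalize and total_w != 0:
        delta /= total_w
    return base + delta

# Donor weights from the config (the base model itself contributes
# no delta; donor values here are made up for illustration).
donors = [
    (1.2, 0.9),       # Mistral-Nemo-Instruct-2407
    (0.8, 0.1),       # magnum-consolidatum-v1-12b
    (1.5, 0.001),     # magnum-twilight-12b
    (2.0, 0.000001),  # EXF-Medistral-Nemo-12B
    (0.9, 0.05),      # Mistral-Nemo-Prism-12B
]
merged = task_arithmetic(base=1.0, donors=donors)
```

Note how lopsided the recipe is: the instruct model dominates at 0.9, while EXF-Medistral-Nemo-12B at 1e-6 contributes an essentially homeopathic delta.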
