Bielik-11B-v3.0-Instruct-GGUF

license:apache-2.0
by
speakleash
Language Model
OTHER
11B params
New
5K downloads
Early-stage
Edge AI:
Mobile
Laptop
Server
25GB+ RAM
Quick Summary

Bielik-11B-v3.0-Instruct is an 11-billion-parameter Polish-language instruction-tuned model from the SpeakLeash project, distributed here in GGUF quantizations for local inference.

Device Compatibility

Mobile
4-6GB RAM
Laptop
16GB RAM
Server
GPU
Minimum recommended: 11GB+ RAM
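As a rough sanity check on the RAM figures above, a GGUF model's memory footprint can be estimated from the parameter count and the average bits per weight of the quantization. The sketch below is a back-of-envelope estimate with an assumed overhead factor, not an official requirement.

```python
# Back-of-envelope GGUF memory estimate: params * bits-per-weight / 8,
# scaled by a rough overhead factor for KV cache and runtime buffers.
# Bits-per-weight values are approximate averages for llama.cpp quants.
PARAMS = 11e9  # Bielik-11B

def estimated_ram_gb(bits_per_weight: float, overhead: float = 1.2) -> float:
    weights_bytes = PARAMS * bits_per_weight / 8
    return weights_bytes * overhead / 1e9

q4_k_m = estimated_ram_gb(4.8)  # ~Q4_K_M average bits per weight
q8_0 = estimated_ram_gb(8.5)    # ~Q8_0 average bits per weight

print(f"Q4_K_M: ~{q4_k_m:.1f} GB")  # fits a 16GB laptop
print(f"Q8_0:   ~{q8_0:.1f} GB")
```

This is why the Q4_K_M file is the usual choice for a 16GB laptop, while Q8_0 is more comfortable on a server.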

Code Examples

Example Ollama Modelfile for the Q4_K_M quantization. Remember to set a low temperature for experimental quantizations (1-3 bits):
FROM ./Bielik-11B-v3.0-Instruct.Q4_K_M.gguf

TEMPLATE """<s>{{ if .System }}<|start_header_id|>system<|end_header_id|>

{{ .System }}<|eot_id|>{{ end }}{{ if .Prompt }}<|start_header_id|>user<|end_header_id|>

{{ .Prompt }}<|eot_id|>{{ end }}<|start_header_id|>assistant<|end_header_id|>

{{ .Response }}<|eot_id|>"""

PARAMETER stop "<|start_header_id|>"
PARAMETER stop "<|end_header_id|>"
PARAMETER stop "<|eot_id|>"

# Remember to set a low temperature for experimental models (1-3 bits)
PARAMETER temperature 0.1
A second Modelfile for the Q8_0 quantization, using a ChatML-style template with tool-calling support:
FROM ./Bielik-11B-v3.0-Instruct.Q8_0.gguf

TEMPLATE """{{- /* SYSTEM + TOOLS INJECTION */ -}}
{{- if or .System .Tools -}}
<|im_start|>system
{{- if .System }}
{{ .System }}
{{- end }}

{{- if .Tools }}
You are provided with tool signatures that you can use to assist with the user's query.
You do not have to use a tool if you can respond adequately without it.
Do not make assumptions about tool arguments. If required parameters are missing, ask a clarification question.

If you decide to invoke a tool, you MUST respond with ONLY valid JSON in the following format:
{"name":"<tool-name>","arguments":{...}}

Below is a list of tools you can invoke (JSON):
{{ .Tools }}
{{- end }}
<|im_end|>
{{- end }}

{{- /* MESSAGES */ -}}
{{- range $i, $_ := .Messages }}
<|im_start|>{{ .Role }}
{{ .Content }}<|im_end|>
{{- end }}

{{- /* GENERATION PROMPT */ -}}
<|im_start|>assistant"""

PARAMETER stop "<|im_start|>"
PARAMETER stop "<|im_end|>"

PARAMETER temperature 0.1
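The tool-calling template above instructs the model to respond with only a JSON object of the form {"name":"<tool-name>","arguments":{...}} when it decides to invoke a tool. A minimal, hypothetical parser for that contract might look like this (the function name and example tool are illustrative, not part of the model card):

```python
import json

def parse_tool_call(text: str):
    """Return (name, arguments) if text matches the tool-call contract
    from the TEMPLATE above, else None (treat the text as a normal reply)."""
    try:
        obj = json.loads(text.strip())
    except json.JSONDecodeError:
        return None
    if isinstance(obj, dict) and isinstance(obj.get("name"), str) \
            and isinstance(obj.get("arguments"), dict):
        return obj["name"], obj["arguments"]
    return None

# A valid tool call parses; anything else falls through as plain text.
print(parse_tool_call('{"name":"get_weather","arguments":{"city":"Kraków"}}'))
print(parse_tool_call("Zwykła odpowiedź tekstowa."))
```

Because the template tells the model to emit ONLY the JSON object, a simple parse-or-fallback check like this is usually enough to route between tool execution and a normal chat reply.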

Deploy This Model

Production-ready deployment in minutes

Together.ai

Instant API access to this model

Fastest API

Production-ready inference API. Start free, scale to millions.

Try Free API

Replicate

One-click model deployment

Easiest Setup

Run models in the cloud with simple API. No DevOps required.

Deploy Now

Disclosure: We may earn a commission from these partners. This helps keep LLMYourWay free.