FlofloB

24 models • 27 total models in database

100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit

Core purpose is text generation inference. Base model is unsloth/qwen2.5-0.5b-instruct-bnb-4bit.

license:apache-2.0
105
2
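
These merged 16-bit checkpoints should be loadable with the standard transformers API. A minimal sketch, assuming the repo id matches the entry above and that the hosted weights are a plain merged causal-LM checkpoint (requires the `transformers` and `torch` packages; downloads weights from the Hugging Face Hub on first use):

```python
def generate(prompt, max_new_tokens=20):
    """Load the merged checkpoint and complete a prompt.

    Sketch only: assumes the repo id below is the entry listed above
    and that it loads as a standard causal LM.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    repo_id = "FlofloB/100k_fineweb_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit"
    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForCausalLM.from_pretrained(repo_id)

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)


# Example (requires network access):
# print(generate("Continued pretraining on fineweb"))
```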

83k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit

Language model with Apache 2.0 license.

license:apache-2.0
7
2

10k_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit

Core purpose is text generation inference using the base model unsloth/phi-3-mini-4k-instruct-bnb-4bit.

license:apache-2.0
6
1

10k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit

Core purpose is text generation inference using the base model unsloth/qwen2.5-0.5b-instruct-bnb-4bit.

license:apache-2.0
5
1

smollm2-135M_pretrained_1400k_fineweb

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 1400k fineweb data. It uses the transformers library and is licensed under Apache 2.0.

llama
5
0
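
The smollm2-135M entries in this listing form a series of checkpoints pretrained on growing fineweb budgets (200k up to 1400k, in 200k increments) across three variants (plain, uncovai_selected, uncovai_human_removed). A small helper sketching that naming pattern; note the plain 200k checkpoint is actually named `smollm2_pretrained_200k_fineweb` (no `-135M`), so this is an approximation of the dominant pattern, not an exact index of the repos:

```python
def smollm2_repo_ids(max_k=1400, variant=""):
    """Enumerate repo ids following the dominant naming pattern here,
    e.g. 'FlofloB/smollm2-135M_pretrained_400k_fineweb'.

    `variant` may be '', 'uncovai_selected', or 'uncovai_human_removed'.
    Sketch only: the plain 200k checkpoint deviates from this pattern.
    """
    suffix = f"_{variant}" if variant else ""
    return [
        f"FlofloB/smollm2-135M_pretrained_{k}k_fineweb{suffix}"
        for k in range(200, max_k + 1, 200)
    ]


ids = smollm2_repo_ids(variant="uncovai_selected")
# ids[1] == 'FlofloB/smollm2-135M_pretrained_400k_fineweb_uncovai_selected'
```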

test_continued_pretraining_Phi-3-mini-4k-instruct_Unsloth_merged_16bit

Text generation inference model based on unsloth/phi-3-mini-4k-instruct-bnb-4bit.

license:apache-2.0
4
2

40k_continued_pretraining_Qwen2.5-0.5B-Instruct_Unsloth_merged_16bit

Core purpose is text generation inference using the base model unsloth/qwen2.5-0.5b-instruct-bnb-4bit.

license:apache-2.0
4
1

smollm2-135M_pretrained_400k_fineweb

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 400k fineweb data. It uses the transformers library and is licensed under Apache 2.0.

llama
3
0

smollm2-135M_pretrained_1000k_fineweb_uncovai_selected

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 1000k fineweb data under the Apache 2.0 license.

llama
3
0

smollm2-135M_pretrained_1000k_fineweb

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 1000k fineweb data. It uses the transformers library and is licensed under Apache 2.0. The base model is FlofloB/smollm2-135M_pretrained_800k_fineweb.

llama
2
0

smollm2-135M_pretrained_1200k_fineweb

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 1200k fineweb data. It uses the transformers library and is licensed under Apache 2.0.

llama
2
0

smollm2-135M_pretrained_1400k_fineweb_uncovai_selected

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 1400k uncovai-selected fineweb data. It uses the transformers library and is licensed under Apache 2.0.

llama
2
0

smollm2-135M_pretrained_600k_fineweb_uncovai_human_removed

Pretrained model based on FlofloB/smollm2-135M with 600k fineweb data, with human-generated content removed. It uses the transformers library and is licensed under Apache 2.0.

llama
2
0

smollm2_pretrained_200k_fineweb

Pretrained model based on the HuggingFaceTB/SmolLM2-135M architecture. It is licensed under Apache 2.0 and utilizes the transformers library.

llama
1
1

smollm2-135M_pretrained_200k_fineweb_uncovai_selected

This model is based on the HuggingFaceTB/SmolLM2-135M architecture and is licensed under Apache 2.0. It utilizes the transformers library.

llama
1
1

smollm2-135M_pretrained_400k_fineweb_uncovai_selected

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 400k uncovai-selected fineweb data. It is licensed under Apache 2.0.

llama
1
1

smollm2-135M_pretrained_600k_fineweb_uncovai_selected

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 600k uncovai-selected fineweb data. It uses the transformers library and is licensed under Apache 2.0.

llama
1
0

smollm2-135M_pretrained_800k_fineweb

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 800k fineweb data. It uses the transformers library and is licensed under Apache 2.0. The base model is FlofloB/smollm2-135M_pretrained_600k_fineweb.

llama
1
0

smollm2-135M_pretrained_1200k_fineweb_uncovai_selected

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 1200k uncovai-selected fineweb data. It uses the transformers library and is licensed under Apache 2.0. The base model is FlofloB/smollm2-135M_pretrained_1000k_fineweb_uncovai_selected.

llama
1
0

smollm2-135M_pretrained_200k_fineweb_uncovai_human_removed

This model is based on the HuggingFaceTB/SmolLM2-135M architecture and is licensed under Apache-2.0.

llama
1
0

smollm2-135M_pretrained_400k_fineweb_uncovai_human_removed

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 400k fineweb data, with human-generated content removed. It is licensed under Apache 2.0 and utilizes the transformers library.

llama
1
0

smollm2-135M_pretrained_800k_fineweb_uncovai_human_removed

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 800k fineweb data, with human-generated content removed. It is licensed under Apache 2.0 and utilizes the transformers library.

llama
1
0

smollm2-135M_pretrained_1000k_fineweb_uncovai_human_removed

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 1000k fineweb data, with human-generated content removed. It uses the transformers library and is licensed under Apache 2.0.

llama
1
0

smollm2-135M_pretrained_1200k_fineweb_uncovai_human_removed

This model is based on the FlofloB/smollm2-135M architecture and is pretrained with 1200k fineweb data, with human-generated content removed. It uses the transformers library and is licensed under Apache 2.0.

llama
1
0