bigcode

56 models (each entry lists downloads and likes)

gpt_bigcode-santacoder

37,274
26

starcoder2-7b

Model Summary

StarCoder2-7B is a 7B parameter model trained on 17 programming languages from The Stack v2, with opt-out requests excluded. The model uses Grouped Query Attention, a context window of 16,384 tokens with sliding window attention of 4,096 tokens, and was trained using the Fill-in-the-Middle objective on 3.5+ trillion tokens.

- Project Website: bigcode-project.org
- Paper: Link
- Point of Contact: [email protected]
- Languages: 17 programming languages

Use

The model was trained on GitHub code as well as additional selected data sources such as Arxiv and Wikipedia. As such it is not an instruction model, and commands like "Write a function that computes the square root." do not work well.

Generation

Here are some examples to get started with the model. You can find a script for fine-tuning in StarCoder2's GitHub repository. First, make sure to install `transformers` from source. The examples cover running the model on CPU/GPU/multi-GPU in full precision, as well as quantized versions through `bitsandbytes` using 8-bit precision (int8); a minimal sketch of both setups follows this card.

The pretraining dataset of the model was filtered for permissive licenses and code with no license only. Nevertheless, the model can generate source code verbatim from the dataset. The code's license might require attribution and/or other specific requirements that must be respected. We provide a search index that lets you search through the pretraining data to identify where the generated code came from and apply the proper attribution to your code.

Limitations

The model has been trained on source code from 17 programming languages. The predominant natural language in the source code is English, although other languages are also present. As such, the model is capable of generating code snippets provided some context, but the generated code is not guaranteed to work as intended. It can be inefficient and contain bugs or exploits. See the paper for an in-depth discussion of the model limitations.

Training

- Architecture: Transformer decoder with grouped-query and sliding-window attention and Fill-in-the-Middle objective
- Pretraining steps: 1 million
- Pretraining tokens: 3.5+ trillion
- Precision: bfloat16

License

The model is licensed under the BigCode OpenRAIL-M v1 license agreement. You can find the full agreement here.
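The code snippets referenced in the Generation section above were not captured in this listing. As a rough sketch of what the card describes, assuming the Hub checkpoint ID `bigcode/starcoder2-7b` and the standard `transformers` generation API (the prompt and `max_new_tokens` value here are illustrative, not the card's verbatim snippet):

```python
# Install transformers from source first, e.g.:
#   pip install git+https://github.com/huggingface/transformers.git
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder2-7b"  # assumed Hub ID for StarCoder2-7B
device = "cuda"  # use "cpu" for CPU-only inference

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)  # full precision

# StarCoder2 is a base code model, not instruction-tuned, so prompt it with code to complete.
inputs = tokenizer.encode("def print_hello_world():", return_tensors="pt").to(device)
outputs = model.generate(inputs, max_new_tokens=48)
print(tokenizer.decode(outputs[0]))
```

For the 8-bit (int8) quantized variant mentioned in the card, a similar sketch using `bitsandbytes` (requires `pip install bitsandbytes accelerate`; again an assumption-labeled example):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

checkpoint = "bigcode/starcoder2-7b"  # assumed Hub ID
quantization_config = BitsAndBytesConfig(load_in_8bit=True)  # int8 weights via bitsandbytes

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint, quantization_config=quantization_config)

inputs = tokenizer.encode("def print_hello_world():", return_tensors="pt").to(model.device)
outputs = model.generate(inputs, max_new_tokens=48)
print(tokenizer.decode(outputs[0]))
```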

16,008
198

starpii

12,277
123

starcoder

11,535
2,905

starcoderbase

6,020
412

starcoderbase-1b

5,139
87

tiny_starcoder_py

3,363
74

santacoder

2,679
334

starcoderbase-7b

1,927
34

starcoderbase-3b

1,166
25

starcoder2-15b-instruct-v0.1

896
102

starencoder

283
54

santacoderpack

dataset:bigcode/commitpack-subset-cf
143
5

octocoder

dataset:bigcode/commitpackft
100
68

deepseekcoder-33b-codeqwen-align-subset

llama
83
1

starcoder-co-format

80
0

starcoder-o

77
0

starcoder-cxso

77
0

starcoder-cxo

75
0

starcoder-co-manual

75
0

starcoder-co-target

74
2

starcoder-xo

70
0

octogeex

dataset:bigcode/commitpackft
63
22

santacoder-ldf

license:mit
63
2

santacoder-cf

license:mit
63
0

santacoder-fast-inference

38
4

astraios-1b-lora

dataset:bigcode/guanaco-commits
8
0

astraios-lora

dataset:bigcode/guanaco-commits
7
0

astraios-ptuning

dataset:bigcode/guanaco-commits
7
0

astraios-parallel

dataset:bigcode/guanaco-commits
6
0

astraios-7b-lora

dataset:bigcode/guanaco-commits
6
0

astraios-adapterp

dataset:bigcode/guanaco-commits
5
0

astraios-3b-lora

dataset:bigcode/guanaco-commits
5
0

astraios-7b-ptuning

dataset:bigcode/guanaco-commits
4
0

astraios-7b-adapterh

dataset:bigcode/guanaco-commits
4
0

astraios-7b-parallel

dataset:bigcode/guanaco-commits
4
0

astraios-3b-adapterh

dataset:bigcode/guanaco-commits
4
0

astraios-ia3

dataset:bigcode/guanaco-commits
3
0

astraios-3b-ia3

dataset:bigcode/guanaco-commits
3
0

astraios-3b-parallel

dataset:bigcode/guanaco-commits
3
0

starcoderplus

2
219

astraios-adapterh

dataset:bigcode/guanaco-commits
1
0

astraios-7b-adapterp

dataset:bigcode/guanaco-commits
1
0

astraios-7b-ia3

dataset:bigcode/guanaco-commits
1
0

astraios-1b-ptuning

dataset:bigcode/guanaco-commits
1
0

astraios-1b-parallel

dataset:bigcode/guanaco-commits
1
0

astraios-1b-ia3

dataset:bigcode/guanaco-commits
1
0

astraios-1b-adapterh

dataset:bigcode/guanaco-commits
1
0

astraios-1b-adapterp

dataset:bigcode/guanaco-commits
1
0

astraios-3b-ptuning

dataset:bigcode/guanaco-commits
1
0

astraios-3b-adapterp

dataset:bigcode/guanaco-commits
1
0

starcoder-megatron

0
6

starcoderplus-megatron

0
6

santacoder-megatron

0
3

starcoderbase-megatron

0
2

report

license:apache-2.0
0
2