NYTK

38 models

named-entity-recognition-nerkor-hubert-hungarian

license:apache-2.0
23,732
3

sentiment-hts5-xlm-roberta-hungarian

license:mit
1,747
1

sentiment-hts5-hubert-hungarian

license:apache-2.0
1,170
1

PULI-GPTrio

Language support for Hungarian and English.

license:cc-by-nc-4.0
706
10

sentence-transformers-experimental-hubert-hungarian

license:apache-2.0
276
1

PULI-BERT-Large

license:cc-by-nc-4.0
228
3

sentiment-ohb3-hubert-hungarian

license:apache-2.0
168
1

text-generation-news-gpt2-small-hungarian

license:mit
137
5

translation-m2m100-1.2B-multi12-hungarian

license:mit
78
1

translation-bart-128-en-hu

license:apache-2.0
49
0

PULI-GPT-2

license:cc-by-nc-4.0
47
1

translation-bart-hu-en

license:apache-2.0
36
0

PULI-GPT-3SX

license:cc-by-nc-4.0
33
13

summarization-hi-mbart-large-50-hungarian

license:mit
28
1

text-generation-poem-petofi-gpt2-small-hungarian

license:mit
24
2

PULI-LlumiX-Llama-3.1

license:llama
22
10

translation-nllb-200-3.3B-multi12-hungarian

license:cc-by-nc-4.0
21
4

PULI-LlumiX-32K

License: llama2. Language: Hungarian (hu).

license:llama2
19
11

reading-comprehension-hurc-hubert-hungarian

license:apache-2.0
17
2

morphological-generator-emmorph-mt5-hungarian

license:apache-2.0
17
0

hucola-puli-bert-large-hungarian

license:cc-by-nc-4.0
16
1

morphological-generator-ud-mt5-hungarian

license:apache-2.0
13
0

hurte-puli-bert-large-hungarian

license:cc-by-nc-4.0
11
0

PULI-HuBA-mamba-130M

PULI-HuBA 130M is a monolingual Hungarian foundation model based on the Mamba architecture (https://huggingface.co/state-spaces/mamba-130m-hf).

Architecture:
MambaForCausalLM(
  (backbone): MambaModel(
    (embeddings): Embedding(52000, 768)
    (layers): ModuleList(
      (0-23): 24 x MambaBlock(
        (norm): MambaRMSNorm(768, eps=1e-05)
        (mixer): MambaMixer(
          (conv1d): Conv1d(1536, 1536, kernel_size=(4,), stride=(1,), padding=(3,), groups=1536)
          (act): SiLU()
          (in_proj): Linear(in_features=768, out_features=3072, bias=False)
          (x_proj): Linear(in_features=1536, out_features=80, bias=False)
          (dt_proj): Linear(in_features=48, out_features=1536, bias=True)
          (out_proj): Linear(in_features=1536, out_features=768, bias=False)
        )
      )
    )
    (norm_f): MambaRMSNorm(768, eps=1e-05)
  )
  (lm_head): Linear(in_features=768, out_features=52000, bias=False)
)

The model was trained on a ~3.48B-token dataset that was toxicity-filtered, deduplicated, and semantically segmented.

License: Apache 2.0
Hardware: 4 × NVIDIA A100 (80 GB) GPUs
Year of training: 2024
Input/output: text only
Parameter count: 130 million
Available model size: single variant
Data type: float32
Batch size: 10 per GPU
Learning rate: 3e-4
Reference: GitHub issue

The model may generate biased, incorrect, or harmful content.

To generate text with this model using Hugging Face's `pipeline`, use the following Python code:

If you have any questions, please contact: [email protected] or [email protected]
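The card refers to a `pipeline` example whose code did not survive extraction. A minimal sketch of the intended usage, assuming the model is published under the `NYTK/PULI-HuBA-mamba-130M` repo id and that the prompt and generation settings below are illustrative rather than the authors' own:

```python
# Minimal text-generation sketch for PULI-HuBA-mamba-130M.
# The repo id and generation parameters are assumptions, not taken from the card.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="NYTK/PULI-HuBA-mamba-130M",
)

prompt = "Magyarország fővárosa"  # "The capital of Hungary"
outputs = generator(prompt, max_new_tokens=50)
print(outputs[0]["generated_text"])
```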

license:apache-2.0
10
3

PULI-Trio-Q

- Trained with LLaMA-Factory (GitHub)
- The Qwen2.5 7B Instruct model was continually pretrained on:
  - Hungarian (8.08 billion words): 763K documents exceeding 5,000 words in length, plus the Hungarian Wikipedia
  - English: Long Context QA (2 billion words), BookSum (78 million words)
  - Chinese (3 billion characters): Wudao
- Training was completed with a Hungarian-only dataset: 626 million Hungarian words (1 epoch) from the Hungarian Wikipedia and news articles

Citation: if you use this model, please cite the following paper:
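Since the card describes a continually pretrained causal LM but shows no loading code, here is a hedged sketch of using it with `transformers`; the repo id `NYTK/PULI-Trio-Q` and the prompt are assumptions based on the model name, not confirmed by the card:

```python
# Hedged sketch: loading a continually pretrained causal LM with transformers.
# The repo id "NYTK/PULI-Trio-Q" is an assumption inferred from the model name.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NYTK/PULI-Trio-Q"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# "Among the sights of Budapest are ..."
inputs = tokenizer("Budapest nevezetességei közé tartozik", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```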

license:apache-2.0
9
3

sentiment-ohb3-xlm-roberta-hungarian

license:mit
9
1

husst-puli-bert-large-hungarian

license:cc-by-nc-4.0
8
0

summarization-hi-mt5-base-hungarian

license:apache-2.0
6
0

translation-mt5-small-128-en-hu

license:apache-2.0
5
2

translation-bart-en-hu

license:apache-2.0
5
1

summarization-hi-bart-hungarian

license:apache-2.0
4
0

translation-marianmt-en-hu

license:gpl-3.0
3
1

summarization-hi-bart-base-1024-hungarian

license:apache-2.0
3
0

sentiment-hts2-hubert-hungarian

license:apache-2.0
2
1

summarization-nol-bart-hungarian

license:apache-2.0
2
0

ocr-cleaning-mt5-base-hungarian

license:apache-2.0
2
0

sentiment-hts2-xlm-roberta-hungarian

license:mit
1
0

quality-estimation-huq-xlm-roberta-en-hu

license:mit
1
0