nazimali

3 models

mistral-7b-v0.3-instruct-arabic

license:apache-2.0

Mistral-Nemo-Kurdish-Instruct

This is a 12B parameter model, fine-tuned from `nazimali/Mistral-Nemo-Kurdish` on a single Kurdish (Kurmanji) instruction dataset. My intention was to train it with both the Kurdish Kurmanji Latin script and the Kurdish Sorani Arabic script, but training took much longer than anticipated, so I decided to start with one full Kurdish Kurmanji dataset. I will look into a multi-GPU training setup so I don't have to wait all day for results, and I still want to train it with both the Kurmanji and Sorani Arabic scripts.

Training: Transformers `4.44.2`, 1× NVIDIA A40, duration 7h 41m 12s.

Dataset:
- `saillab/alpaca-kurdishkurmanji-cleaned`
- Dataset number of rows: 52,002
- Filtered columns: `instruction, output`
- Must have at least 1 character
- Must be less than 10,000 characters
- Number of rows used for training: 41,559
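The row filtering described above can be reproduced roughly as follows. This is a minimal sketch assuming the Hugging Face `datasets` library was used; the column names and length limits come from the list above, and the `keep` helper is only illustrative, not the exact code used for this card.

```python
from datasets import load_dataset

# Load the Kurmanji instruction dataset used for fine-tuning.
dataset = load_dataset("saillab/alpaca-kurdishkurmanji-cleaned", split="train")

# Keep only the columns used for training.
dataset = dataset.select_columns(["instruction", "output"])

# Keep rows whose fields have at least 1 character and fewer than 10,000.
def keep(row):
    return all(1 <= len(row[col]) < 10_000 for col in ("instruction", "output"))

dataset = dataset.filter(keep)
print(len(dataset))  # the card reports 41,559 rows after filtering
```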

license:apache-2.0

Mistral-Nemo-Kurdish

Continued pre-training of `mistralai/Mistral-Nemo-Instruct-2407` on a Kurdish Wikipedia dataset, using `unsloth`. This model should be fine-tuned further, since the pre-training was only meant to improve Kurdish language understanding. It is quantized with `bitsandbytes` so that it uses less memory; see the bitsandbytes documentation. There isn't a standard, or even a good, Kurdish metric for evaluating the model (that I could find), so my next project will be to create an evaluation that provides a reproducible baseline for Kurdish. I will look into a multi-GPU training setup so I don't have to wait all day for results, and I would like to train it with both Kurmanji and Sorani. It should be fine-tuned further for a specific task; see the instruction fine-tuned model `nazimali/Mistral-Nemo-Kurdish-Instruct`.

Training: Transformers `4.44.2`, 1× NVIDIA A100 80GB PCIe, duration 6h 31m 4s.

Dataset:
- `nazimali/kurdish-wikipedia-articles`
- Dataset number of rows: 63,076
- Filtered columns: `title, text`
- Must have at least 1 character
- Number of rows used for training: 62,720

Training prompt template:

```python
trainingprompt = """Gotara Wikipedia
Sernav: {}
"""
```
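Because the published weights are quantized with `bitsandbytes`, loading the model through `transformers` looks roughly like the sketch below. This is a minimal example assuming a 4-bit NF4 configuration and an example Kurdish prompt; it is not necessarily the exact quantization setup used for the upload.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "nazimali/Mistral-Nemo-Kurdish"

# 4-bit quantization so the 12B model fits in less GPU memory.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# The base model is for continued pre-training, so prompt it with raw Kurdish text.
inputs = tokenizer("Gotara Wikipedia\nSernav: Kurdistan", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```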

license:apache-2.0