plamo-2-translate-gguf
by mmnga
Language Model · 2 languages · Q4 quantization · 3K downloads
Quick Summary
plamo-2-translate-gguf is a GGUF-format conversion of pfnet's plamo-2-translate. The imatrix data was created using TFMC/imatrix-dataset-for-japanese-llm.
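The prompt in the usage example below follows PLaMo's operator format: a `dataset` declaration, an `input` block naming the source language, and an `output` operator naming the target language. A minimal helper that assembles this format (the function name and defaults are illustrative, not part of the release) might look like:

```python
def build_plamo_prompt(text: str, src: str = "English", dst: str = "Japanese") -> str:
    """Assemble the PLaMo-2 translation prompt shown in the usage example.

    Hypothetical helper: the operator format is taken from the llama-cli
    command in this card; the function itself is not part of the release.
    """
    return (
        "<|plamo:op|>dataset\n"
        "translation\n\n"
        f"<|plamo:op|>input lang={src}\n"
        f"{text}\n"
        f"<|plamo:op|>output lang={dst}"
    )

# Reproduces the prompt string used in the llama-cli command below.
prompt = build_plamo_prompt("Write the text to be translated here.")
```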
Code Examples

Usage (llama.cpp):

git clone https://github.com/ggml-org/llama.cpp.git
cd llama.cpp
cmake -B build -DGGML_CUDA=ON   # drop -DGGML_CUDA=ON for a CPU-only build
cmake --build build --config Release
build/bin/llama-cli -m 'plamo-2-translate-gguf' -n 128 -c 128 \
  -p '<|plamo:op|>dataset\ntranslation\n\n<|plamo:op|>input lang=English\nWrite the text to be translated here.\n<|plamo:op|>output lang=Japanese' \
  -no-cnv
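After the `output lang=Japanese` operator, the model generates the translation, which may be followed by the next `<|plamo:op|>` marker. A small post-processing sketch (assuming the raw completion includes such a trailing marker; the helper is illustrative, not part of the release):

```python
def extract_translation(generated: str) -> str:
    """Return the text produced before the next <|plamo:op|> marker.

    Illustrative post-processing: assumes the raw completion may carry a
    trailing <|plamo:op|> operator that should be stripped.
    """
    translation, _, _ = generated.partition("<|plamo:op|>")
    return translation.strip()

# Hypothetical raw completion from the command above:
raw = "ここに翻訳対象のテキストを書いてください。\n<|plamo:op|>"
translation = extract_translation(raw)
```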