cyberagent
layerd-birefnet
This repository contains the code and model weights for the matting module in [LayerD [ICCV'25]](https://arxiv.org/abs/2509.25134), a layer decomposition framework for graphic design images. The model is intended to be used as part of the original LayerD GitHub repository; see https://github.com/CyberAgentAILab/LayerD for more information.

The model architecture code is based on the BiRefNet repository, and we thank the authors for releasing their high-quality matting model. Because this repository is intended for use with LayerD, we recommend following the instructions in the LayerD repository, which show how the model is loaded and invoked.

This repository is released under the Apache-2.0 license, the same as the LayerD repository. The original BiRefNet is released under the MIT license.
open-calm-small
open-calm-3b
Mistral-Nemo-Japanese-Instruct-2408
This is a Japanese continually pre-trained model based on mistralai/Mistral-Nemo-Instruct-2407. Make sure to update your transformers installation via `pip install --upgrade transformers`.
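As a rough illustration of the prompt shape (a sketch only: the authoritative template ships with the model's tokenizer config, so the exact markers here are an assumption), Mistral-style instruct models wrap each user turn in `[INST] ... [/INST]` delimiters. In practice, prefer `tokenizer.apply_chat_template()` from transformers rather than hand-rolling the string:

```python
def build_mistral_prompt(user_message: str, bos: str = "<s>") -> str:
    """Build a single-turn prompt in the Mistral [INST] style.

    NOTE: hand-rolled for illustration only. Use
    tokenizer.apply_chat_template(), which reads the template
    shipped with the model, for real inference.
    """
    return f"{bos}[INST]{user_message}[/INST]"

prompt = build_mistral_prompt("AIによって私達の暮らしはどのように変わりますか？")
print(prompt)
```

The model would then generate the assistant reply after the closing `[/INST]` marker.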
open-calm-7b
calm2-7b-chat
open-calm-large
calm2-7b
open-calm-1b
CAT-Translate-1.4b
DeepSeek R1 Distill Qwen 14B Japanese
This is a Japanese fine-tuned model based on deepseek-ai/DeepSeek-R1-Distill-Qwen-14B.
open-calm-medium
calm3-22b-chat
CyberAgentLM3 is a decoder-only language model pre-trained from scratch on 2.0 trillion tokens. CyberAgentLM3-Chat is a fine-tuned model specialized for dialogue use cases. CALM3-Chat uses ChatML as the prompt format.

- Model size: 22B
- Context length: 16384
- Model type: Transformer-based Language Model
- Language(s): Japanese, English
- Developed by: CyberAgent, Inc.
- License: Apache-2.0
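ChatML wraps each turn in `<|im_start|>` / `<|im_end|>` markers. A minimal sketch of the generic format is below; for real inference, prefer `tokenizer.apply_chat_template()`, which applies the exact template shipped with calm3-22b-chat:

```python
def to_chatml(messages):
    """Render a message list in the generic ChatML format.

    Each message is a dict with "role" ("system", "user", or
    "assistant") and "content". The trailing "<|im_start|>assistant"
    cues the model to generate the assistant turn.
    """
    parts = [f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>" for m in messages]
    parts.append("<|im_start|>assistant")
    return "\n".join(parts)

prompt = to_chatml([
    {"role": "system", "content": "あなたは親切なAIアシスタントです。"},
    {"role": "user", "content": "AIによって私達の暮らしはどのように変わりますか？"},
])
print(prompt)
```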
calm2-7b-chat-dpo-experimental
Llama-3.1-70B-Japanese-Instruct-2407
This is a Japanese continually pre-trained model based on meta-llama/Meta-Llama-3.1-70B-Instruct. Make sure to update your transformers installation via `pip install --upgrade transformers`.
xlm-roberta-large-jnli-jsick
DeepSeek R1 Distill Qwen 32B Japanese
This is a Japanese fine-tuned model based on deepseek-ai/DeepSeek-R1-Distill-Qwen-32B.