GT4SD
Multitask Text And Chemistry T5 Base Augm
Multitask Text and Chemistry T5: a multi-domain, multi-task language model that solves a wide range of tasks in both the chemical and natural language domains. Published by Christofidellis et al. (2023).

Model Details: the Multitask Text and Chemistry T5 variant trained using t5-base as its pretrained base and the augmented dataset.

- Developers: Dimitrios Christofidellis, Giorgio Giannone, Jannis Born, Teodoro Laino and Matteo Manica from IBM Research, and Ole Winther from the Technical University of Denmark.
- Distributors: model natively integrated into GT4SD.
- Model type: a Transformer-based language model trained on a multi-domain, multi-task dataset built by aggregating available datasets for the tasks of forward reaction prediction, retrosynthesis, molecular captioning, text-conditional de novo generation, and paragraph-to-actions.
- Information about training algorithms, parameters, fairness constraints or other applied approaches, and features: N.A.
- Paper or other resource for more information: The Multitask Text and Chemistry T5, Christofidellis et al. (2023).
- Where to send questions or comments about the model: open an issue on the GT4SD repository.
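As a rough sketch of how such a multi-task seq2seq model could be queried, the snippet below routes each of the five tasks through a natural-language prompt and a standard Hugging Face `transformers` generation call. The prompt wordings, the `TASK_PROMPTS` helper, and the checkpoint name passed to `generate_answer` are illustrative assumptions, not the exact prefixes or identifiers used in the paper or in GT4SD.

```python
# Hypothetical task prompts for the five tasks listed above;
# the exact phrasings used during training may differ.
TASK_PROMPTS = {
    "forward": "Predict the product of the following reaction: {}",
    "retro": "Predict the reaction that produces the following product: {}",
    "caption": "Caption the following molecule: {}",
    "text2mol": "Write in SMILES the molecule described as follows: {}",
    "actions": "Which actions are described in the following paragraph: {}",
}


def build_prompt(task: str, text: str) -> str:
    """Format an input string for one of the five supported tasks."""
    return TASK_PROMPTS[task].format(text)


def generate_answer(model_name: str, task: str, text: str,
                    max_new_tokens: int = 128) -> str:
    """Run one prompted generation; model_name is an assumed checkpoint id."""
    # Imported here so the prompt helper above works without transformers.
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)
    inputs = tokenizer(build_prompt(task, text), return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

In practice the model can also be driven through GT4SD's own interfaces rather than raw `transformers` calls; the sketch above only illustrates the single-model, prompt-per-task pattern the model card describes.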