Fu, Nihang and Wei, Lai and Song, Yuqi and Li, Qinyang and Xin, Rui and Omee, Sadman Sadeed and Dong, Rongzhi and Siriwardane, Edirisuriya M Dilanga and Hu, Jianjun (2023) Material transformers: deep learning language models for generative materials design. Machine Learning: Science and Technology, 4 (1). 015001. ISSN 2632-2153
Fu_2023_Mach._Learn.__Sci._Technol._4_015001.pdf - Published Version
Abstract
Pre-trained transformer language models (LMs) trained on large unlabeled corpora have produced state-of-the-art results in natural language processing, organic molecule design, and protein sequence generation. However, no such models have been applied to learn composition patterns for the generative design of material compositions. Here we train a series of seven modern transformer models (GPT, GPT-2, GPT-Neo, GPT-J, BLMM, BART, and RoBERTa) for materials design using the expanded formulas of the ICSD, OQMD, and Materials Project databases. Six different datasets, with or without non-charge-neutral or electronegativity-balanced (EB) samples, are used to benchmark the generative design performance and to uncover the biases of modern transformer models for the generative design of material compositions. Our experiments show that the materials transformers based on causal LMs can generate chemically valid material compositions, with up to 97.61% being charge neutral and 91.22% being electronegativity balanced, a more than six-fold enrichment over the baseline pseudo-random sampling algorithm. Our LMs also demonstrate high generation novelty, and their potential for new materials discovery is demonstrated by their ability to recover held-out materials. We also find that the properties of the generated compositions can be tailored by training the models on selected training sets, such as high-band-gap samples. Our experiments further show that different models each have their own preferences regarding the properties of the generated samples, and that their running times vary considerably. We have applied our materials transformers to discover a set of new materials, validated using density functional theory (DFT) calculations. All our trained materials transformer models and code can be accessed freely at http://www.github.com/usccolumbia/MTransformer.
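The two validity metrics reported above, charge neutrality and electronegativity balance, can be checked for any generated composition. The sketch below is not the authors' evaluation code; it is a minimal illustration that assumes a composition is already parsed into an element-to-count dictionary, uses a small hand-picked table of oxidation states and Pauling electronegativities, and applies a simplified balance rule (every cation less electronegative than every anion).

```python
# Minimal sketch: charge-neutrality and electronegativity-balance checks for a
# generated composition. Illustrative only; the lookup tables below cover just
# a few elements, and the balance rule is a simplified heuristic.
from itertools import product

# Illustrative lookup tables (small subset, not the full periodic table).
OXIDATION_STATES = {
    "Li": [1], "Fe": [2, 3], "P": [-3, 3, 5], "O": [-2],
}
ELECTRONEGATIVITY = {
    "Li": 0.98, "Fe": 1.83, "P": 2.19, "O": 3.44,
}

def charge_neutral_assignments(composition):
    """Yield oxidation-state assignments whose total charge sums to zero."""
    elements = list(composition)
    for states in product(*(OXIDATION_STATES[el] for el in elements)):
        total = sum(s * composition[el] for el, s in zip(elements, states))
        if total == 0:
            yield dict(zip(elements, states))

def electronegativity_balanced(assignment):
    """Heuristic: every cation must be less electronegative than every anion."""
    cations = [el for el, s in assignment.items() if s > 0]
    anions = [el for el, s in assignment.items() if s < 0]
    if not cations or not anions:
        return False
    return max(ELECTRONEGATIVITY[el] for el in cations) < min(
        ELECTRONEGATIVITY[el] for el in anions
    )

if __name__ == "__main__":
    # Example generated composition: LiFePO4 as an element -> count dict.
    comp = {"Li": 1, "Fe": 1, "P": 1, "O": 4}
    valid = [a for a in charge_neutral_assignments(comp)
             if electronegativity_balanced(a)]
    print("charge-neutral and EB assignments:", valid)
```

For LiFePO4 the only neutral assignment found by this sketch is Li+1, Fe+2, P+5, O-2, which also passes the balance heuristic; a composition with no such assignment would count as invalid under both metrics.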
| Item Type: | Article |
|---|---|
| Subjects: | Pustakas > Multidisciplinary |
| Depositing User: | Unnamed user with email support@pustakas.com |
| Date Deposited: | 11 Oct 2023 05:42 |
| Last Modified: | 11 Oct 2023 05:42 |
| URI: | http://archive.pcbmb.org/id/eprint/969 |