Crystal graph attention networks for the prediction of stable materials

Authors: J. Schmidt, L. Pettersson, C. Verdozzi, S. Botti, and M.A.L. Marques

Ref.: Sci. Adv. 7, eabi7948 (2021)

Abstract: Graph neural networks for crystal structures typically use the atomic positions and the atomic species as input. Unfortunately, this information is not available when predicting new materials, for which the precise geometrical information is unknown. We circumvent this problem by replacing the precise bond distances with embeddings of graph distances. This allows our networks to be applied directly in high-throughput studies based on both composition and crystal structure prototype without using relaxed structures as input. To train these networks, we curate a dataset of over 2 million density-functional calculations of crystals with consistent calculation parameters. We apply the resulting model to the high-throughput search of 15 million tetragonal perovskites of composition ABCD2. As a result, we identify several thousand potentially stable compounds and demonstrate that transfer learning from the newly curated dataset reduces the required training data by 50%.
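The core idea of the abstract — replacing precise bond distances with embeddings of integer graph distances, so that no relaxed geometry is needed — can be illustrated with a minimal sketch. This is not the authors' implementation; the toy adjacency list, the hop-count cap, and the embedding table are all hypothetical stand-ins for what a real crystal graph network would learn.

```python
from collections import deque

def graph_distances(adj, source):
    """BFS hop distances from `source` in an unweighted bond graph."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

# Toy 4-atom cell: atom 0 bonded to atoms 1 and 2; atom 3 bonded to atom 2.
adj = {0: [1, 2], 1: [0], 2: [0, 3], 3: [2]}
d = graph_distances(adj, 0)

# Hypothetical embedding table: each hop count indexes a vector that stands
# in for the (unknown) bond length as an edge feature. In the real model
# these vectors would be learned parameters.
MAX_HOPS, DIM = 4, 3
embedding = [[i] * DIM for i in range(MAX_HOPS + 1)]
edge_feature = embedding[min(d[3], MAX_HOPS)]
```

Because the graph distance depends only on the bond topology of the structure prototype, such features are available before any relaxation, which is what allows the trained network to screen candidate compositions directly.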

Citations: 0 (Google Scholar)

DOI: 10.1126/sciadv.abi7948

Bibtex:

@article{Schmidt_2021,
	doi = {10.1126/sciadv.abi7948},
	url = {https://doi.org/10.1126%2Fsciadv.abi7948},
	year = 2021,
	month = {dec},
	publisher = {American Association for the Advancement of Science ({AAAS})},
	volume = {7},
	number = {49},
	pages = {eabi7948},
	author = {Jonathan Schmidt and Love Pettersson and Claudio Verdozzi and Silvana Botti and Miguel A. L. Marques},
	title = {Crystal graph attention networks for the prediction of stable materials},
	journal = {Science Advances}
}