
Preface

Published 30 October 2019 © 2019 Commonwealth Scientific and Industrial Research Organisation.
Focus on Advanced Material Modelling, Machine Learning and Multiscale Simulation. Citation: Amanda S Barnard 2020 J. Phys. Mater. 3 010301. DOI: 10.1088/2515-7639/ab460c



Original content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.

Advanced materials modelling is experiencing a renaissance. Numerical modelling and computational simulation have been a part of our scientific toolkit for decades, with methodological advances at various length scales being recognised as some of the greatest inventions of our time. For example, since the invention of modern computers in 1975, there have been three Nobel Prizes awarded for computational chemistry, and five further Nobel Prizes in chemistry and physics for the underlying theories that support our computation.

Notably, in 1998 density functional theory and computational quantum chemistry were recognised, and these remain among the most important methods used today to study the structure and properties of matter. In this issue Liu et al [1] use these ab initio methods and semiclassical Boltzmann transport theory to predict the figure of merit of KZnP for thermoelectric applications. Simulations of this type have become such an important part of materials research that high-throughput searches are becoming commonplace, and new workflow tools are needed to optimise the use of high performance computational infrastructure and focus research on the most important materials. This is discussed by Dunn et al [2] in their presentation of the new Rocketsled library. Computational results are now so plentiful that repositories such as NOMAD, currently the largest collection of computational materials data in the world, are needed to capture all of the data, metadata and analysis tools being generated, as discussed by Draxl et al [3].

But recently the computational landscape has been changing. Machine learning is another mature field, which has been a part of theoretical computer science for many decades. In recent years, however, materials scientists have become attracted to machine learning algorithms for their ability to handle complexity and large configuration spaces, and to extract high-dimensional structure/property relationships that would otherwise be obscured, as explained in the review by Schleder et al [4]. This Focus Issue contains interesting examples that span methods and materials, including the prediction of magnetisation reversal by Exl et al [5] in their study of the microstructure of Nd2Fe14B using random forest classification; the prediction of Curie temperatures of new materials, as shown by Nguyen et al [6] in their study of rare earth transition metal binary alloys using kernel ridge regression and hierarchical clustering; and the prediction of interfacial properties, as shown by Oda et al [7] in their study of grain boundaries using support vector machines. These approaches can also utilise spectral data, as shown by Kiyohara et al [8], who use feed-forward neural networks to directly predict six properties of silicon oxides based on simulated core-loss spectra; and can be used to directly visualise multi-dimensional structure/property relationships, as shown by Sun et al [9] in their study of silver and platinum nanoparticles using unsupervised t-distributed stochastic neighbour embedding and self-organising maps. The introduction of machine learning to materials science has even led to new algorithms more attuned to the unique needs of this domain, as described by Ouyang et al [10] in their paper on the new multi-task sure-independence screening and sparsifying operator method designed to handle sparse data sets where not all properties are known for all materials.
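To make the flavour of these supervised approaches concrete, the following is a minimal, purely illustrative sketch of a structure/property regression of the kind described above, using kernel ridge regression from scikit-learn. The descriptors, target property and hyperparameters are assumptions chosen for demonstration only; they do not reproduce any of the studies cited in this issue.

# Illustrative sketch only: kernel ridge regression for a structure/property
# relationship, trained on synthetic (hypothetical) descriptor data.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Hypothetical descriptors (e.g. compositional or structural features) and a
# hypothetical target property, generated at random for demonstration.
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=10) + 0.1 * rng.normal(size=200)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# Kernel ridge regression with an RBF kernel, a common baseline for learning
# nonlinear relationships from tabular materials descriptors.
model = KernelRidge(kernel="rbf", alpha=1e-2, gamma=0.1)
model.fit(X_train, y_train)

print("test MAE:", mean_absolute_error(y_test, model.predict(X_test)))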

Most interestingly, we are also seeing a convergence of these methods and a blurring of computational and discipline boundaries. Fowler et al [11] show how to use data-driven methods to manage the uncertainties in DFT, using a non-Bayesian approach to estimate the uncertainty and identify accurate or inaccurate predictions of the electron density; and Brunton et al [12] review the development of data-driven multiscale methods for predicting the macroscopic properties of heterogeneous microscopic materials, a topic that expands upon work in multiscale simulation that received the Nobel Prize in Chemistry in 2013. Only time will tell what further advances we can expect from materials modelling, machine learning and multiscale simulation.
