Modelling and understanding battery materials with machine-learning-driven atomistic simulations

The realistic computer modelling of battery materials is an important research goal, with open questions ranging from atomic-scale structure and dynamics to macroscopic phenomena. Quantum-mechanical methods offer high accuracy and predictive power in small-scale atomistic simulations, but they quickly reach their limits when complex electrochemical systems are to be studied—for example, when structural disorder or even fully amorphous phases are present, or when reactions take place at the interface between electrodes and electrolytes. In this Perspective, it is argued that emerging machine-learning-based interatomic potentials are promising tools for studying battery materials on the atomistic and nanometre length scales, affording quantum-mechanical accuracy yet being many orders of magnitude faster, and thereby extending the capabilities of current battery modelling methodology. Initial applications to solid-state electrolyte and anode materials in lithium-ion batteries are highlighted, and future directions and possible synergies with experiments are discussed.


Introduction
Batteries are complex, highly specialised electrochemical reaction systems with multiple interdependent parts. To a large extent, their functionality originates on the atomic scale, and it is therefore important to correctly determine the atomistic structure and reactivity of all materials involved. Today, advanced experimental methods can give detailed structural insight into batteries during operation (operando); this includes x-ray and neutron diffraction, electron microscopy down to atomistic resolution, as well as nuclear magnetic resonance (NMR) and x-ray spectroscopy, which both probe the local atomic environments [1][2][3][4]. As is often the case with structurally complex solids, each of these techniques adds a specific piece of the puzzle, but only their combination can provide the full picture, requiring time and expert knowledge. Even if the structure of a given battery material is accurately known, it is still more challenging to establish the connections between primarily structural aspects (e.g. the local environments of Li or Na ions intercalated in carbon-based anodes), the resulting electrochemical properties (manifested in the voltage profile during charging and discharging), and the material's macroscopic behaviour (fatigue, failure, etc.). In addition to the materials themselves, their solid-solid or solid-electrolyte interfaces are just as important, because they can have a direct impact on macroscopic properties: in the development of all-solid-state batteries, for example, a loss of physical contact between different components due to structural inhomogeneity can lead to a loss of performance [5][6][7].
To complement the wide range of experimental characterisation tools in battery materials research, the atomistic structure and reactivity of these materials have been frequently studied with the help of computer simulations. The current primary methods of choice are based on density-functional theory (DFT) for simulation cells containing a few hundred or at most a few thousand atoms (corresponding to the Ångström and few-nm length scales), and on empirically fitted interatomic potential models ('force fields') for atomistic simulations of substantially larger systems. A general introduction to atomistic battery materials modelling may be found in Ref. [8], and a very recent overview has been provided, for example, in Ref. [9]. The aim of such computational studies is two-fold: first, to develop an accurate structural model of a given cathode, electrolyte, or anode material (with 'accurate' here taken to mean 'consistent with available experimental data', including static and dynamic properties); second, to use this structural knowledge to understand the results of experiments, such as the Li-ion conductivity of a solid-state electrolyte or the voltage profile during charging and discharging of a candidate battery. Quantum-mechanically based DFT methods largely offer the required accuracy, to within a few kilojoules per mole or better, and they are predictive, in that they may suggest new experiments and research directions. Prominent examples include DFT predictions of a variety of previously unknown, stable and metastable crystal structures, which were subsequently realised by experiment in many cases [10]. Yet, the computational cost of DFT is so large that one remains restricted to small simulation cells and often to idealised model systems.
For example, the intercalation mechanisms of Li and Na ions in graphite, and the question why Na intercalates so poorly, have been widely studied by DFT computations on small systems, including an in-depth analysis of the various competing energetic contributions [11]. In contrast, similar studies of disordered hard-carbon anodes require much larger model systems owing to the lack of long-range order, and therefore such simulations are still in their infancy.
The aim of this short Perspective is to highlight an emerging, complementary approach in atomistic materials modelling which is thought to be of interest for battery research: namely, the creation of fast and accurate interatomic potentials by machine learning (ML) from quantum-mechanical reference data. With a strict focus on batteries, this work naturally cannot cover all the exciting developments in the field more generally (nor can it make reference to all of the growing body of literature), and the interested reader is referred to recent reviews of ML potentials [12][13][14] and to more general overviews concerned with ML for materials science [15][16][17] and with materials design [18]. There are also recent reviews of diverse emerging ML methods being increasingly applied to energy materials, including, but not restricted to, battery research [19][20][21][22]. It is hoped that the present work will highlight capabilities of ML-driven interatomic potentials (and possible future directions) to researchers in the wider field of energy storage, including both experimentally and computationally focused readers, and that it might thereby encourage new applications of ML potentials and enable further insight into the structures and properties of battery materials.

What are ML potentials?
Machine learning aims to extract information from (very) large datasets, and many refer to it as the 'fourth paradigm' of science, following on from the earlier empirical, model-based, and computational paradigms [23]. Whilst the earliest ML approaches date back to the middle of the 20th century, the availability of immense amounts of data has led to a spectacular rise in interest in recent years. ML-based interatomic potentials are an example of a supervised learning problem, more specifically, an applied regression task: given a dataset of accurately computed energies and forces for selected points on a potential-energy surface (PES), make the best possible fit to these data, without assuming a specific functional form of the PES. The last point marks the principal difference compared to existing, empirically fitted interatomic potential models which have been used for decades. Empirical potentials rely on simple physical models (Coulomb, Lennard-Jones, Buckingham, …) which parameterise the PES as a function of bond lengths, angles, and so on; they are therefore very fast to evaluate, and have been widely used to run large-scale molecular-dynamics (MD) simulations even for complex battery chemistries [8]. In contrast, ML potentials do not pre-define the shape of the interaction, so that even the simplest features of the PES (say, the repulsion between atoms at short distance) must be 'learned' from suitable reference data or otherwise included in the model construction [24]. A more general view of this emerging field was given, e.g. in a recent overview article which includes applications of ML potentials in diverse areas of materials research [13].
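To make the regression idea concrete, the following is a deliberately minimal sketch in plain NumPy (no ML library): energies sampled at selected points on a smooth one-dimensional 'PES', here a Morse-like dimer curve standing in for quantum-mechanical reference data, are fitted by kernel ridge regression, and the model then predicts energies at unseen points without any functional form of the PES ever being assumed. All numerical values are illustrative only.

```python
import numpy as np

# Toy 1D 'potential-energy surface': a Morse-like dimer curve.  In a real
# ML potential the reference data would come from DFT; here we simply
# generate them analytically for illustration.
def reference_energy(r):
    return (1.0 - np.exp(-1.5 * (r - 1.5))) ** 2 - 1.0

# 'Reference database': energies sampled at selected interatomic distances.
r_train = np.linspace(0.8, 3.0, 15)
e_train = reference_energy(r_train)

# Gaussian (squared-exponential) kernel: compares two configurations.
def kernel(a, b, sigma=0.3):
    return np.exp(-((a[:, None] - b[None, :]) ** 2) / (2.0 * sigma ** 2))

# Kernel ridge regression: solve (K + lam * I) c = E for the weights c.
lam = 1e-6
K = kernel(r_train, r_train)
coeffs = np.linalg.solve(K + lam * np.eye(len(r_train)), e_train)

# Predict at new points: no functional form of the PES was ever assumed.
r_test = np.array([1.2, 1.8, 2.4])
e_pred = kernel(r_test, r_train) @ coeffs
for r, e in zip(r_test, e_pred):
    print(f"r = {r:.2f}  E_pred = {e:+.4f}  E_ref = {reference_energy(r):+.4f}")
```

The same kernel-based construction, generalised to many-atom environments and suitable structural representations, underlies the Gaussian approximation potential framework discussed below.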
Figure 1 provides an overview of the most central concepts [13], using here the example of the aforementioned carbon anode materials: 'learning' from small unit cells which sample the PES at properly chosen points, such as a distorted graphite structure and more disordered supercells based on it, ML potentials enable the study of much larger systems: here, a 1000-atom cell, which describes a material that resembles graphite but is strongly disordered [25]. Indeed, systems with tens of thousands of atoms are within reach for ML potentials, and their simulation-cell lengths approach the size of nanostructures in devices: experiment and highly accurate simulation are now looking at the same thing, in terms of length scales [26].
The construction of the reference database, highlighted in yellow in figure 1, is one of the three central tasks (and often, one of the barriers) in developing any ML potential, alongside the representation of atomic structure and the regression itself [13]. So far, reference databases have been widely built by hand: for example, running MD simulations with evolving versions of a potential and 'feeding back' structural snapshots for single-point energy and force evaluation, thereby gradually growing the database [29,30]. More recent, ongoing research in the community is concerned with the question of how such databases might be built and optimised in a more automated fashion [31][32][33][34], which would promise an increased usefulness (through wider availability of high-quality databases) to many fields of applied research. The challenges of database construction, with a specific focus on energy materials including batteries, have been emphasised in a recent Topical Review in the present journal [20]: highly complex interfaces must be represented by small structural models which nonetheless need to adequately cover all parts of the 'real-world' system, and an improved standardisation of fitting databases and protocols would benefit developments across the field [20].

Figure 1. The general idea and the main required components. First, a reference database of small structural models (here, represented by a distorted graphite structure) is constructed and single-point quantum-mechanical computations are carried out; the database is then typically grown and curated over time. Having sampled the high-dimensional potential-energy surface (PES) at selected points and represented the atomic environments in a suitable mathematical form, the fit itself is carried out; in most cases, this is based on the assumption that the total energy, E, can be decomposed into atomic contributions, ε_j [27,28]. This way, much larger systems can be studied, exemplified here by a disordered graphite-like structure containing ≈1000 atoms (drawn with data from Ref. [25]). Bottom: three major families of regression methods that are currently used for fitting ML potentials, as described in the text. Adapted from Ref. [13] with permission. Original figure copyright © 2019 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
The three families of ML regression methods that are currently most widely used for fitting potentials are sketched in the lower three panels in figure 1. Artificial neural networks (ANNs) [12] are sets of mathematical functions that mimic the network of neurons in the brain; the 'training' of an ANN model consists of adjusting the weights, w, between these functions by adapting them to a set of known data (the reference database). In 2007, Behler and Parrinello showed how ANN potentials for high-dimensional (condensed-phase) systems can be constructed, by using a suitable representation of the local atomic structure and by writing the total energy as a sum of 'machine-learned' local energies [27], the latter approximation now being central to other fitting methods as well. Today, a wide variety of developments based on ANNs has been reported, including, e.g. a broadly applicable ML potential for organic molecules [35] or a recent class of 'deep' neural network models [36]. The second family of methods used to make ML potentials is based on kernel (similarity) functions, which compare given known and new atomic environments. In 2010, Bartók et al introduced the Gaussian approximation potential (GAP) [28] framework, which utilises Gaussian process regression and a suitable choice of representation [37]; other methodological developments related to kernel methods include gradient-domain ML [38] and automated 'on-the-fly' potentials coupled to DFT-MD [39]. The third family of methods is based on linear fitting: the idea here is to express an atomic structure by appropriately chosen basis functions, B, which enter the energy model using (only) linear terms; this includes the spectral neighbor analysis potential (SNAP) [40] and moment tensor potential (MTP) [41] approaches. 
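The decomposition of the total energy into 'machine-learned' atomic contributions, combined with the linear-fitting idea of the third family, can be sketched in a few lines. In the toy example below (plain NumPy; all quantities invented for illustration), a hidden pairwise interaction stands in for quantum-mechanical data, each atom of a one-dimensional chain is described by Gaussian basis functions of its neighbour distances, and a linear model E = Σ_j ε_j with ε_j = w · B_j is fitted to total energies of small cells and then applied to a much larger structure.

```python
import numpy as np

rng = np.random.default_rng(0)

CUTOFF = 2.0                          # neighbour cutoff distance
CENTERS = np.linspace(0.8, 2.0, 8)    # Gaussian basis-function centres
WIDTH = 0.15

def pair_energy(r):
    # Hidden pairwise 'ground truth', standing in for quantum-mechanical data.
    return (1.2 / r) ** 8 - 2.0 * (1.2 / r) ** 4

def total_energy(pos):
    d = np.abs(pos[:, None] - pos[None, :])
    mask = (d > 0) & (d < CUTOFF)
    return 0.5 * pair_energy(d[mask]).sum()   # each pair counted once

def atomic_descriptors(pos):
    # B_j: neighbour distances of atom j, expanded in Gaussian basis functions.
    d = np.abs(pos[:, None] - pos[None, :])
    mask = (d > 0) & (d < CUTOFF)
    B = np.zeros((len(pos), len(CENTERS)))
    for j in range(len(pos)):
        r = d[j][mask[j]]
        if r.size:
            B[j] = np.exp(-((r[:, None] - CENTERS) ** 2)
                          / (2 * WIDTH ** 2)).sum(axis=0)
    return B

# 'Reference database': 200 small 6-atom chains; the fitting target is the
# total energy only.  E = sum_j eps_j with eps_j = w . B_j is linear in w.
X, y = [], []
for _ in range(200):
    pos = np.concatenate(([0.0], np.cumsum(rng.uniform(1.0, 1.7, size=5))))
    X.append(atomic_descriptors(pos).sum(axis=0))
    y.append(total_energy(pos))
w, *_ = np.linalg.lstsq(np.array(X), np.array(y), rcond=None)

# Application to a structure five times larger than anything in the database.
big = np.concatenate(([0.0], np.cumsum(rng.uniform(1.0, 1.7, size=29))))
eps = atomic_descriptors(big) @ w     # 'machine-learned' atomic energies
print(f"predicted E = {eps.sum():.2f}, reference E = {total_energy(big):.2f}")
```

Real linear ML potentials such as SNAP or MTP use far more sophisticated, rotation- and permutation-invariant basis functions in three dimensions, but the structure of the fit (a linear model over per-atom basis functions, summed to a total energy) is the same.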
All three classes of fitting methods sketched in figure 1 can yield potentials with high performance, as shown in a recent benchmark study across different implementations [42]; all three are under continuous improvement, each in specific ways. All three classes of methods and derived ML potentials have begun to be used for battery-related studies very recently, and these emerging applications are discussed in the following section.
ML potentials are on the verge of being widely used-and openly available source codes and potential parameters, as well as integration with existing simulation software, are all poised to accelerate their uptake in the community. Notable examples are the interfaces to the widely used (and similarly freely available) LAMMPS MD code [43], which have now been provided for several classes of ML potentials: therefore, an experienced user might need to do no more than swap out a few lines of input to run their existing simulation setup with a new ML potential. It is emphasised, again, that these potentials are still much more computationally costly than their empirically fitted counterparts. This trade-off was recently discussed by de Tomas et al [44] in a systematic benchmark study of carbon potentials (focusing on disordered, partly graphitised forms, which are relevant to battery research as well): among the methods tested, an ML potential showed high performance but was also by far the most expensive [44], and a good empirical potential may be a much faster option in many cases, as shown by previous successful applications of such potentials to graphitic carbon (see, e.g. [45]). Indeed, battery research includes larger-scale simulation tasks which will be out of reach for many currently available ML potentials, but accessible to fast empirical models. On the other hand, there are complex atomic-scale problems (e.g. bond breaking and making during thin-film growth [46]) for which an ML potential might well be worth the additional cost. It is about the right choice of computational tools, as always.
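As an illustration of how small such an input change can be, the following hypothetical LAMMPS fragment swaps an empirical silicon potential for a (previously fitted) SNAP-type ML potential; the file names are placeholders, and the fitted coefficient files would have to be supplied by the potential developers.

```
# Before: empirical potential (e.g. Tersoff for silicon)
pair_style    tersoff
pair_coeff    * * Si.tersoff Si

# After: ML potential (here SNAP; coefficient files from a fitted model)
pair_style    snap
pair_coeff    * * Si.snapcoeff Si.snapparam Si
```

The rest of the simulation setup (cell, thermostat, time step, output) can, in principle, remain unchanged, although ML potentials may require a shorter time step or a larger neighbour cutoff than the empirical model they replace.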

What has been done?
The ML-driven atomistic modelling of battery materials is at a very early stage, and it has largely focused on selected prototype materials so far. Figure 2(a) provides a (highly simplified) overview of different components in a battery. Blue circles indicate research directions which have already been pursued with ML potentials; green circles show areas where, as of this writing and to the author's best knowledge, no directly battery-related applications of ML potentials have been published, but where the required methodology, or large parts of it, is principally available. Walking from left to right through the schematic in figure 2(a), anode materials have been studied with several different types of ML potential frameworks; of particular interest here are those materials whose structures are too complex to be easily described by DFT, such as disordered carbon- and silicon-based anodes. Whilst previous ML-driven simulations have focused on bulk anode phases, e.g. LixSi [47,48], the associated electrode-electrolyte interfaces are similarly important, as mentioned, e.g. by Huang et al [49], and accurate atomistic models of such interfaces are often plainly out of reach for DFT. For the electrolyte itself, there are two primary classes, liquid and solid, of which the latter has garnered some attention from the ML potential community, as discussed below. Finally, the cathode materials are typically strongly ionic solids, ranging in structural complexity from the iconic LiCoO2 [50] to nanostructured metal oxides [51], all of which pose challenges of their own to ML potential development (see, e.g. Ref. [52] for an early study of a prototypical metal oxide). The present section will focus on currently reported applications to battery materials (i.e. on the blue circles in figure 2(a)), whereas the following section will give an outlook on possible future directions.
Figure 2(b) shows an example of the model solid-state electrolyte, Li3PO4, in which the Li+ ions move between rigid [PO4]3− tetrahedra, representative of many more complex electrolyte materials. The material was studied using MD simulations with a neural-network type potential [53], illustrating how ML potentials enable the DFT-accurate treatment of much larger simulation cells (often large enough that the prediction of a property of interest does not change substantially when the cell size is increased further [55]). This example is also representative of how ML potentials can improve not only the length but also the simulation time scales: sometimes, the challenge is not in the mere number of atoms, but in being able to perform many millions (and more) of individual MD steps-e.g. to create high-quality amorphous silicon structures by simulated slow quenching from the melt [56], or to obtain accurate diffusion coefficients in solid-state electrolytes, as studied in Ref. [53] for Li3PO4. The authors also provide an example of how simulation results are typically validated in the ML potential community: by comparison to small-scale DFT simulations, for example, in terms of computed radial distribution functions (which provide a simplified but instructive 'fingerprint' for the local structure) [53].
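Diffusion coefficients of the kind mentioned above are typically extracted from the slope of the mean-squared displacement (MSD) of the mobile ions via the Einstein relation, MSD(t) = 6Dt in three dimensions. The sketch below illustrates the analysis on a synthetic lattice random walk (whose diffusivity is known analytically) standing in for a real MD trajectory; all parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic stand-in for an MD trajectory: 200 non-interacting 'ions'
# hopping on a cubic lattice (hop length A, one hop per time step DT).
# The analytic diffusivity of this walk is D = A**2 / (6 * DT).
N_IONS, N_STEPS = 200, 2000
A, DT = 1.0, 1.0

choice = rng.integers(0, 6, size=(N_STEPS, N_IONS))   # +/-x, +/-y, +/-z hops
moves = np.zeros((N_STEPS, N_IONS, 3))
for axis in range(3):
    moves[..., axis] = A * ((choice == 2 * axis).astype(float)
                            - (choice == 2 * axis + 1).astype(float))
traj = np.cumsum(moves, axis=0)                       # unwrapped positions

# Mean-squared displacement from the starting point, averaged over ions.
msd = (traj ** 2).sum(axis=-1).mean(axis=1)
t = (np.arange(N_STEPS) + 1) * DT

# Einstein relation in three dimensions: MSD(t) = 6 * D * t; fit the slope.
d_fit = np.polyfit(t, msd, 1)[0] / 6.0
print(f"fitted D = {d_fit:.3f} (analytic: {A**2 / (6 * DT):.3f})")
```

In a real analysis, the MSD would be computed from an (ML-driven) MD trajectory, time origins would be averaged over, and only the linear diffusive regime of the MSD would be used for the fit.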
A similar validation strategy, although with a different ML fitting framework and application case, was followed by Fujikake et al [57] who introduced a GAP model for Li diffusion in disordered carbon structures. In that case, the emphasis was on how one can build difference potentials by subtracting previously known interactions (those between carbon atoms) and fitting a separate ML model for the energy and force differences resulting from Li intercalation into the host structure. That said, the validation of ML potentials, not only in terms of fitting accuracy and computed observables, but also in terms of how resilient they are to failure, remains an outstanding challenge, especially because of the absence of a physically motivated functional form (as would be present in empirical interatomic potentials).

Figure 2. (b) Simulation cells representing structurally disordered Li3PO4, indicating the step up in system size that is possible by moving from DFT to ML potentials [53]. Reprinted from [53], with the permission of AIP Publishing. (c) Computational prediction of Li-ion diffusivity in selected candidate cathode coating (or electrolyte) materials [54], each indicated by a different colour. The plot compares the authors' results of DFT-driven ab initio MD (open symbols) to those of an MTP-driven 'learn on the fly' scheme (filled symbols): the latter gives access to much faster simulations, and therefore allows the study of diffusion at lower temperatures. Adapted with permission from [54]. Copyright 2020 American Chemical Society. (d) Global exploration of Li-Si alloys using neural-network potential modelling [47]: here, an ANN scheme is coupled to two different computational approaches, a genetic algorithm (top) and MD (bottom), and the results are then translated into simulated voltage-filling curves with the help of DFT, which may be compared to experimental data, as shown on the right-hand side (see [47] and references therein). Plots reprinted from [47], with the permission of AIP Publishing.
The long-term value of ML potentials in battery materials research hinges on their ability to predict not only structural properties, but also those properties that are more directly tied to application. An example of this was recently demonstrated for cathode coating materials (many of which correspond to solid electrolytes, although the specific requirements are different in detail) [54]. Existing high-throughput studies aiming to find such materials, e.g. in Ref. [58], are typically concerned with materials properties as a function of the chemical composition, and they do not normally take the atomistic structure into account explicitly (this is well justified when isostructural materials are studied, but may pose challenges for new compounds whose structure is a priori unknown). One of the reasons is that a fully structure-specific treatment, by running long DFT simulations in an adequately sized supercell, measuring the atomic displacements over time, and fitting diffusivities to the resulting data, is often impracticable. Figure 2(c) now shows an ML-driven approach to this problem, for a range of materials from very good (β-Li3PS4, activation barrier of 0.23 eV) to rather poor ion conductors (Li4GeO4, 1.14 eV), all treated in the same computational framework either by DFT-driven ('ab initio') molecular dynamics (AIMD), or with a 'learn-on-the-fly' (LOTF) ML model [54]. Having access to simulation tools which are orders of magnitude faster than previously available makes it possible to study lower temperatures, at which atomic 'jumps' occur less frequently and therefore longer simulations are needed to predict the diffusivity. For example, the temperature range in which simulation results are shown in figure 2(c) ends at 800 K for AIMD, but reaches down to 300 K for the LOTF MD (the specific range for these and other materials is given in Ref. [54]).
The authors reported a total simulation time of more than 1300 nanoseconds across a variety of materials that they studied, corresponding to hundreds of millions of individual simulation steps. The work in Ref. [54] was carried out using the MTP approach [41], but the authors' general findings are likely transferable to other suitable ML potential fitting methods.
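The activation barriers quoted above follow from the temperature dependence of the diffusivity via the Arrhenius law, D(T) = D0 exp(−Ea/kBT): a linear fit of ln D against 1/T gives the barrier from the slope. The sketch below uses invented numbers (the barrier chosen to resemble the β-Li3PS4 value quoted above), generated from an assumed Arrhenius law so that the fit can be checked.

```python
import numpy as np

KB = 8.617333e-5   # Boltzmann constant in eV/K

# Hypothetical diffusivities (cm^2/s), generated here from an assumed
# Arrhenius law so that the fit can be checked; in practice they would
# come from MSD analyses of MD runs at several temperatures.
EA_TRUE, D0_TRUE = 0.23, 1.0e-3     # barrier chosen to resemble beta-Li3PS4
temps = np.array([300.0, 400.0, 500.0, 600.0, 800.0])
d_vals = D0_TRUE * np.exp(-EA_TRUE / (KB * temps))

# ln D = ln D0 - Ea / (kB * T): a linear fit in 1/T recovers the barrier.
slope, intercept = np.polyfit(1.0 / temps, np.log(d_vals), 1)
ea_fit, d0_fit = -slope * KB, np.exp(intercept)
print(f"Ea = {ea_fit:.3f} eV, D0 = {d0_fit:.2e} cm^2/s")
```

With real simulation data, each D(T) value carries statistical uncertainty (fewer 'jumps' at low temperature), which is precisely why the faster LOTF simulations, reaching down to 300 K, improve the reliability of the fitted barrier.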
On the anode side (figure 2(a)), the intercalation of Li ions into graphite is a 'textbook' reaction in the context of batteries; somewhat more generally, the intercalation of Li into various nanostructured (nanotubes, etc.) and disordered forms of carbon ('hard' carbons) has been widely studied by DFT [9]. Such studies have also been extended to other alkali and alkaline-earth ions, because hard carbons are (for example) a promising contender for Na-ion battery anodes [59]. It was previously shown that disordered forms of carbon can be accurately described by ML potentials, with evidence including various structural aspects [30,44] and the demonstrated ability to describe the growth of thin films in quantitative agreement with experimentally observed 'sp3' content [46]. Among a range of emerging applications, GAP-driven simulations have been used to generate small-scale structural models of porous carbon at different densities [25] and to survey alkali metal intercalation in such structures, based on a subsequent combination with DFT energy and charge analyses (which will be discussed in the section below) [49].
ML potentials are widely coupled to MD simulations, but they may be used in combination with any method which requires energies as a function of atomic coordinates. In the context of battery anodes, Artrith and colleagues demonstrated the combination of ML (in that case, ANN) potentials with different ways to explore and quantify the local structures of amorphous Li-Si phases (figure 2(d)) [47]. In addition to MD, this included sampling using a genetic algorithm based on a neural-network energy prediction, and both sampling methods allowed the authors to probe many candidate structures at a given composition (x, Li filling, shown on the horizontal axis in the panels in figure 2(d)). Once the most favourable structures at a given x and their respective energies are known, the associated voltage-filling curves can be computed [9], providing one of the primary quantities of interest for battery materials. In another study of Li-Si phases, which introduced a more general 'implanted' neural-network methodology for the different parts of compositional space, the diffusivity was investigated and tested against experiments [48], which is directly relevant to charging and discharging.
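For reference, the average voltage between two compositions x1 < x2 of a LixSi-type electrode follows from the total energies of the end members and of Li metal: V = −[E(Lix2Si) − E(Lix1Si) − (x2 − x1)E(Li)] / (x2 − x1), in volts per Li for the single-electron Li+/Li couple when energies are in eV. The sketch below uses invented energies purely for illustration; in the studies discussed above, the corresponding values come from DFT or ML-potential calculations.

```python
import numpy as np

# Invented total energies (eV per formula unit) for Li_x Si phases and for
# Li metal -- illustrative numbers only, not DFT results.
E_LI_METAL = -1.90
x = np.array([0.0, 1.0, 2.0, 3.25])              # Li content in Li_x Si
E = np.array([-5.40, -7.95, -10.35, -13.20])     # E(Li_x Si)

# Average voltage for each step x1 -> x2 (single-electron Li+/Li couple):
#   V = -[E(x2) - E(x1) - (x2 - x1) * E(Li)] / (x2 - x1)
dx = np.diff(x)
voltages = -(np.diff(E) - dx * E_LI_METAL) / dx
for x1, x2, v in zip(x[:-1], x[1:], voltages):
    print(f"x = {x1:.2f} -> {x2:.2f}: V = {v:.2f} V")
```

Plotting such step-wise voltages against x yields the simulated voltage-filling curve shown schematically in figure 2(d); the invented numbers above were chosen merely to give a monotonically decreasing profile.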
It is likely that elemental anodes, due to their relatively simple compositions but complex atomistic structures, will continue to be valuable targets for ML-driven modelling, extending previous successful studies based on DFT-driven structure exploration [60,61] whilst reaching larger simulation length scales than previously available (exemplified recently, e.g. in Ref. [62]). It is emphasised, again, that the present state of the field in figure 2(a) refers specifically to atomistic modelling-other ML methods, such as the generation of simulated microstructures from generative adversarial networks (GANs) are beginning to emerge as well [63], but are outside of the scope of the present Perspective.

What is next?
ML potentials are an increasingly popular approach to atomistic simulations, but compared to the established methodologies in battery modelling, they are only just getting started. In the future, there might be two directions for the more general uptake of ML potentials in the wider community [13]. On the one hand, carefully curated fitting databases and potentials are beginning to be provided online, and such 'general-purpose' potentials [24,64] would be expected to be applicable (in the sense of reaching close-to-DFT accuracy) for a wide range of relevant problems and to be at least robust (not showing unphysical behaviour) for any reasonable atomistic structure at a given composition. The most broadly useful potentials will be those which deal with widely studied material systems: in the context of batteries, this is expected to include relatively simple chemical systems which nonetheless continue to pose open questions, such as the most prototypical oxide cathode materials. On the other hand, there is the entirely automated ('on-the-fly') learning of interatomic potentials, which was recently coupled to a widely used DFT code [39]. This approach gives the user less control over the parameterisation of the potential, but in turn requires no (or almost no) input from their side, and it does not require the transferability of a general-purpose ML potential; instead, it allows the user to speed up a simulation of a given, perhaps very specific, atomistic system which has been 'kick-started' by DFT.
New developments in ML potential fitting methodology are expected to substantially accelerate research in the field (quite literally, through faster algorithms and codes) and to enable applications to a wider range of battery-related problems. Most of the discussion within this short Perspective has focused on condensed phases, although molecular modelling is similarly beginning to benefit very strongly from ML potentials, often reaching far beyond what standard DFT approaches can do [64,65]. In the context of batteries, this promises a more accurate description of liquid electrolytes, and in particular of diverse chemical reactions taking place at solid-electrolyte interfaces, which involve the making and breaking of covalent bonds and are therefore not always easily accessible to existing simulation methods. Specific examples are the adverse reactions that must be kept under control in lithium-air batteries [66].
In terms of materials to be studied, one may now anticipate the step up towards more chemically complex compositions, and ultimately to the discovery of new materials before they have been synthesised. Looking back at figure 2(a), one area that has so far been less widely explored with ML potentials is that of bulk and nanostructured cathode materials; indeed, not many ML-driven studies of complex oxides have been carried out, partly due to structural and compositional complexity, partly due to the presence of long-range electrostatic interactions. An example where all these challenges come together is a class of recently reported low-symmetry block and bronze-like phases for lithium-ion storage [67]: will ML potentials be able to treat such materials in the future? In terms of technical ability, early methodological developments have been made, including, for example, the 'learning' of environment-dependent charges with separate ANNs [52], or the construction of primarily charge-based force fields for strongly ionic materials [68]. In terms of the chemical complexity, the coupling to existing materials databases and high-throughput workflows may prove highly beneficial [18,69], perhaps accelerated further by ML predictions of electrode voltages [70] or other application-relevant properties, which might narrow down the range of candidate compositions and structures.
In terms of methodological developments, it is also interesting to envision new synergies between established DFT methods and experimental data, both being enabled by 'bridging' through ML potentials. Figure 3(a) sketches a few connections of this type which are beginning to emerge, and which are expected to be relevant in the future. One recurring theme is the generation of systematic libraries of small, representative structural models which can then be studied further with regard to electronic structure and bonding (figure 3(b)) [25,55]. A recent example in the wider field of energy materials, showcasing the usefulness of combining ML-based and existing simulation methods, is the computational modelling of CO2 reduction catalysts based on chemically modified carbonaceous materials [71]. Candidate structures were explored in large-scale DFT computations [71], starting from an existing library of pure amorphous carbon surface models, the latter having been generated previously with an ML potential [55]. Another energy-related application case is a study of porous carbons in supercapacitors and specifically the effect of pore size on diffusivity [72], again building partly on ML-generated structural models [25] and moving on to much more complex chemical systems.
Turning to a possible direct synergy with experiments (figure 3(a)), it would now be desirable to carry out systematic experimental benchmark studies: for example, one might synthesise silicon anodes with various degrees of local order in the laboratory, allowing the community to test ML-driven predictions with regard to both the pure phases [56] and the Li-intercalated ones [47,48]. More 'tailored' synthetic carbon materials, with parameters (such as the annealing temperature) tuned in fine steps, have begun to be reported very recently [75], and studies of this type would be expected to be highly valuable to the developers of ML potentials (and other atomistic simulation methods). A somewhat similar request for systematic experimental benchmark studies has been formulated recently for molecular quantum chemistry [76].
A clear long-term perspective is the connection of DFT, ML potentials, and experiments to achieve a combined structural solution, and thereby to arrive at a more holistic understanding of the atomistic and nanometre structure of batteries. Local structural probes including pair distribution function curves, as exemplified in figure 3(c) for Na-ion batteries [74], can be compared to the outcome of accurate simulations. Comparison with solid-state NMR data has been used as a means to validate ML-computed structures [56], and this technique is very widely employed in battery research as well [2]. For example, NMR experiments have been used to trace the chemical nature of Li [77] and Na [74] ions during insertion in porous carbons, both ex situ and operando (the latter being exemplified in figure 3(d)), and these measurements have begun to be linked to ML-driven computational work and DFT-based bonding analyses [25]. In the future, the tripartite combination of accurate input structures (from ML-driven simulations), accurate methodology for predicting the NMR parameters themselves (based on DFT methods, as reviewed in Ref. [78]), and 'real-world' experimental data is expected to become a powerful tool for battery research. In this context, one may mention recent progress towards predicting x-ray spectroscopy (XAS and XPS) fingerprints using combined ML- and DFT-based structural models [79,80]. Such strategies may serve both for validating atomistic simulations and for better understanding the outcome of electrochemical experiments in the future.

Figure 3. (b) A recent example of how atomistic modelling driven by ML potentials can be interfaced to DFT bonding analyses. After a library of disordered carbon structures has been generated in long ML-driven MD runs, DFT computations are carried out to study the intercalation behaviour of Li ions [49]. The plot is a distance map based on structural similarities, as detailed in Ref. [49] and references therein, colour-coded by the charges on individual atomic environments (from Li+ almost down to metallic Li0), which can be computed using DFT [73]. Adapted from Ref. [49] by permission of The Royal Society of Chemistry. (c) Pair distribution functions, plotted with data from Ref. [74]. Arrows indicate the appearance of subtle features on the nanometre scale due to Na+ (de-)intercalation. (d) Operando 23Na NMR measurements for the same system [74], showing the cyclic appearance and disappearance of a more metallic-like signal in the measured chemical shift during operation. Adapted from Ref. [74]; published by The Royal Society of Chemistry. These experimental probes for the local structure can now be compared with the outcome of accurate, ML-driven atomistic simulations, both regarding structural models (panel (c)) and the physical property predictions that may be based on them (panel (d)).
The 2019 Nobel Prize in Chemistry to Goodenough, Whittingham, and Yoshino [81] was awarded for fundamental studies that later led to an industrial-scale breakthrough. Many of the fundamental studies in battery research have been concerned with structure and reaction mechanisms on the atomistic scale. We now have more accurate atomistic insight into materials than ever, with modelling and experiment going hand in hand, and ML becoming applicable to 'real-world' problems in the field. With accurate ML-driven potentials and simulations in reach, there is hope of thoroughly understanding not only idealised crystal structures but also grains and their interfaces, as well as partly or fully amorphous phases, all of which are of primary importance for batteries. Without doubt, it is exciting to envision that in the coming years, ML potentials may become a firmly integrated part of the battery modellers' toolkit, and to see what this will mean for the coming decade of energy materials development.