Fantasia, A (2026). Accelerating Materials Simulations Across Scales: Artificial Neural Networks for Atomistic and Continuum Modeling. (Doctoral thesis, , 2026).

Accelerating Materials Simulations Across Scales: Artificial Neural Networks for Atomistic and Continuum Modeling

FANTASIA, ANDREA
2026

Abstract

This thesis explores the development of artificial neural network (NN) surrogates to accelerate materials simulations across scales, addressing the computational limitations of traditional methods in both atomistic and continuum modeling. At the atomic scale, machine learning interatomic potentials (MLIPs) represent a critical advancement, bridging the gap between the high computational cost of density functional theory (DFT) and the limited transferability of empirical potentials. We develop an MLIP using the DeePMD-kit framework, designed to achieve near-DFT accuracy while substantially reducing computational cost, with particular application to pressure-induced crystal phase transitions in germanium. The potential is trained using an iterative active learning strategy with strategic sampling of transition-state configurations, employing advanced techniques such as solid-state nudged elastic band and dimer methods to adequately explore the potential energy landscape. Comprehensive benchmarking and validation demonstrate the model's capability to predict transition pathways and pressure-dependent kinetics accurately. At the continuum scale, we develop a deep-learning-based surrogate model to approximate phase field (PF) evolution with high fidelity and significant computational speedup. Specifically, we design a convolutional recurrent neural network architecture implemented in PyTorch to accelerate PF simulations of spinodal decomposition in coherently strained alloy systems. This system was selected as a case study because its complex and diverse microstructural patterns provide an ideal testbed for evaluating the framework's capabilities. The emphasis is on the generality of the proposed architecture, which is designed to be versatile and applicable to a broader class of continuum models beyond the specific application studied.
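To illustrate the kind of continuum dynamics the surrogate is trained to reproduce, the sketch below advances the Cahn-Hilliard equation, the standard phase-field model of spinodal decomposition, by a semi-implicit spectral scheme. This is not the thesis code: it omits the coherency-strain coupling of the alloy systems studied, and all parameter values (grid size, time step, gradient coefficient, mobility) are illustrative.

```python
import numpy as np

def cahn_hilliard_step(c, dt=0.1, kappa=1.0, M=1.0):
    """One semi-implicit spectral step of the Cahn-Hilliard equation,
    dc/dt = M * lap(c**3 - c - kappa * lap(c)),
    on a periodic grid (parameters are illustrative)."""
    kx = 2 * np.pi * np.fft.fftfreq(c.shape[0])
    ky = 2 * np.pi * np.fft.fftfreq(c.shape[1])
    k2 = kx[:, None] ** 2 + ky[None, :] ** 2
    mu_hat = np.fft.fft2(c ** 3 - c)  # local part of the chemical potential
    c_hat = np.fft.fft2(c)
    # treat the stiff fourth-order gradient term implicitly for stability
    c_hat = (c_hat - dt * M * k2 * mu_hat) / (1.0 + dt * M * kappa * k2 ** 2)
    return np.real(np.fft.ifft2(c_hat))

# a small random perturbation around c = 0 coarsens into a spinodal pattern
rng = np.random.default_rng(0)
c = 0.01 * rng.standard_normal((64, 64))
for _ in range(400):
    c = cahn_hilliard_step(c)
```

A surrogate of the type described in the abstract would learn the map from a short history of such fields to the next frame, replacing the repeated spectral solves with a single network evaluation.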
BERGAMASCHINI, ROBERTO
ML Potentials; DFT; Phase Field Modeling; Convolutional NNs; Recurrent NNs
Sector PHYS-04/A - Theoretical physics of matter, models, mathematical methods and applications
English
31-Mar-2026
38
2024/2025
open
Files in this item:
phd_unimib_813398.pdf
Open access
Description: Accelerating Materials Simulations Across Scales: Artificial Neural Networks for Atomistic and Continuum Modeling
Attachment type: Doctoral thesis
Size: 4.53 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/10281/599604