Andrea Ferretti

07 Nov 2016


Andrea Ferretti (CNR-NANO) received a master's degree in Materials Engineering and a PhD in Physics from the University of Modena and Reggio Emilia under the supervision of Elisa Molinari. From 2009 to 2011 he held a joint postdoctoral appointment at MIT (MA, USA) and the University of Oxford (UK) in the group of Nicola Marzari. Since 2011 he has been a researcher at CNR-NANO, with a permanent position since 2012.


Andrea Ferretti works in the field of condensed-matter and solid-state physics, performing ab initio simulations at the DFT level and beyond.
His current research interests focus on the ab initio study of the electronic and transport properties of organic semiconductors and hybrid interfaces.


His research topics include:
(i) the interaction of large conjugated molecules with metal surfaces, including the development of effective methods to study the electronic and optical spectroscopies of such systems;
(ii) electronic structure and transport in nanoscale conductors: the role and inclusion of correlation effects, in both theory and applications.

Andrea Ferretti is a developer of scientific software (WanT, Yambo, Quantum ESPRESSO) and the author of 40 publications in international, peer-reviewed scientific journals (about 920 citations, h-index 18).

He is also active as a scientist within the MaX (MAterials design at the eXascale) Centre of Excellence for HPC applications.



When: Day 2 - 15th November, Session 8: DMP Technical Services, Part 1, 10:30 - 11:30

Handling data and workflows in computational materials science: the AiiDA initiative.

Abstract. High-throughput computing (HTC) is emerging as an effective methodology in computational materials science for the discovery of novel materials. Its adoption is spreading rapidly, and HTC is becoming an essential tool for computational materials scientists.
The reasons are twofold: on the one hand, quantum simulation engines have reached unprecedented accuracy, making it possible to predict materials properties with high precision. On the other hand, the throughput capacity of current high-performance computing (HPC) resources keeps increasing exponentially, doubling roughly every 14 months. Nevertheless, running a high-throughput investigation is not a trivial task. A typical study may require running and analyzing tens of thousands of calculations, making it essential to have automated tools for the submission and management of calculations, as well as for the handling and sharing of the produced data.


In this talk I will describe some recent advances made possible by the development of a dedicated tool, AiiDA (http://www.aiida.net).
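
To give a purely illustrative flavour of what this looks like in practice, here is a minimal sketch of submitting a single calculation through AiiDA's Python API. It assumes a configured AiiDA profile and a Quantum ESPRESSO pw.x code labelled 'pw@localhost' (both hypothetical), and it uses a recent version of the API, which has evolved since the time of this talk:

```python
# Sketch: submit one calculation through AiiDA (hypothetical setup).
# Assumes a configured profile and a code labelled 'pw@localhost'.
from aiida import load_profile, orm
from aiida.engine import submit

load_profile()  # connect to the configured AiiDA profile/database

code = orm.load_code('pw@localhost')  # a previously configured simulation code
builder = code.get_builder()          # process builder exposing the code's inputs

# Inputs are AiiDA data nodes, so they end up in the provenance graph.
# Other required inputs (structure, k-points, pseudopotentials) are
# omitted here for brevity.
builder.parameters = orm.Dict(dict={'CONTROL': {'calculation': 'scf'}})
builder.metadata.options.resources = {'num_machines': 1}

node = submit(builder)  # non-blocking; the AiiDA daemon runs the calculation
print(f'Submitted calculation with PK {node.pk}')
```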


This framework automatically stores calculations, together with their inputs and outputs, in a suitable repository to ensure the reproducibility of each calculation, to preserve the provenance of the data, and to allow fast querying of the results via tailored databases.
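
As a hedged illustration of such querying (not taken from the talk), one might use AiiDA's QueryBuilder to retrieve results across all stored calculations; the attribute key 'energy' is an assumption about how the output parameters were stored:

```python
# Sketch: query the energies of all finished calculations in the repository.
from aiida import load_profile, orm

load_profile()

qb = orm.QueryBuilder()
# All calculation-job nodes that finished successfully...
qb.append(orm.CalcJobNode, tag='calc',
          filters={'attributes.process_state': 'finished'})
# ...and, for each, the output Dict node carrying its parsed results.
qb.append(orm.Dict, with_incoming='calc', project=['attributes.energy'])

for (energy,) in qb.all():
    print(energy)
```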
Most importantly, it provides a framework to encode, run, and reproduce not only single calculations but also complex workflows, as sketched below. This activity is part of the MaX (MAterials design at the eXascale) Centre of Excellence for high-performance computing.
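
To sketch what encoding a workflow step means in practice, the snippet below uses AiiDA's calcfunction decorator (from the current API, which postdates this 2016 talk): every call is automatically recorded as a node in the provenance graph, with links from its inputs to its outputs. The function itself is a toy placeholder:

```python
# Sketch: a provenance-aware workflow step via a calcfunction.
# Each call is stored as a calculation node linked to its input and
# output data nodes, making the result reproducible and queryable.
from aiida import load_profile, orm
from aiida.engine import calcfunction

load_profile()

@calcfunction
def rescale_volume(volume: orm.Float, factor: orm.Float) -> orm.Float:
    """Toy step: rescale a cell volume (illustrative placeholder)."""
    return orm.Float(volume.value * factor.value)

result = rescale_volume(orm.Float(100.0), orm.Float(1.05))
print(result.value, '-> stored with PK', result.pk)
```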