A Python interface to the Dutch Atmospheric Large-Eddy Simulation



Summary of project objectives
The overarching goal of this project is to reach a better understanding of cloud-climate feedbacks, leading to reduced uncertainty in climate sensitivity estimates. To achieve this, we pursue a computational strategy of developing three-dimensional superparameterization (3dSP) by embedding 3-D convection-resolving Large Eddy Simulation (LES) models in grid columns of a global model (OpenIFS). The LES models are embedded as a two-way nesting (or two-way coupling): the global model column state drives the LES model, and the LES feeds back to the global model. The nested LES models replace the traditional convection parameterization schemes in the global model columns. We work with DALES, the Dutch Atmospheric Large Eddy Simulation, as the convection-resolving LES. Because superparameterization with fully 3-D LES models is computationally very expensive, we develop the model coupling such that it can be applied regionally, i.e. to user-selected model columns of OpenIFS. The computer resources of this special project are intended for performing simulations with the coupled (OpenIFS-DALES) 3dSP model.
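The two-way nesting described above can be illustrated schematically as a relaxation exchange per coupling step. This is a minimal sketch under simplifying assumptions (all names, the relaxation form, and the time scale are hypothetical illustrations, not the actual OpenIFS-DALES coupling code):

```python
def coupling_step(gcm_column, les_mean, dt, tau=3600.0):
    """One schematic superparameterization exchange.

    gcm_column : global-model column profile (e.g. temperature per level)
    les_mean   : horizontal-mean profile of the embedded LES
    dt         : coupling interval [s]
    tau        : relaxation time scale [s]

    Returns (forcing applied to the LES, feedback tendency for the column).
    """
    # Forcing: nudge the LES mean state toward the global column state
    forcing = [(g - l) / tau for g, l in zip(gcm_column, les_mean)]
    # Feedback: the LES mean state drives the column's tendency
    feedback = [(l - g) / dt for g, l in zip(gcm_column, les_mean)]
    return forcing, feedback

# Two-level toy profiles: column slightly warmer at level 0, cooler at level 1
f, fb = coupling_step([290.0, 285.0], [289.0, 286.0], dt=900.0)
```

The key property of two-way nesting is visible here: each model's state enters the other's tendency, so neither evolves independently.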

Summary of problems encountered
(If you encountered any problems of a more technical nature, please describe them here.) In 2017 we found that AMUSE does not work well with the Cray MPI installed on the ECMWF Cray: when AMUSE spawns worker processes, it launches them with MPI_Comm_spawn(), which the Cray MPI does not support. We solved this with a workaround in which all workers are launched at the start of the simulation as a regular MPI job, after which the appropriate MPI communicators are created. This works for us, since we know ahead of time how many workers a particular simulation needs. Newer versions of the Cray MPI are expected to support MPI_Comm_spawn(); for now, we still use the workaround with pre-launched worker codes.
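The workaround can be sketched as follows: instead of calling MPI_Comm_spawn() at run time, all ranks are started in a single MPI job and MPI_Comm_split() carves out one communicator per worker. The rank-to-worker mapping is plain arithmetic; the sketch below shows only that mapping (the function name and layout are illustrative, not the actual AMUSE/OMUSE implementation):

```python
def split_colors(n_ranks, worker_sizes):
    """Assign an MPI_Comm_split 'color' to every rank of one MPI job.

    Rank 0 runs the master (Python driver); the remaining ranks are
    divided into consecutive blocks, one block per pre-launched worker.
    Returns colors[rank] = group index (0 = master, 1..N = workers).
    """
    assert 1 + sum(worker_sizes) == n_ranks, "job size must match the plan"
    colors = [0]  # rank 0: the master script
    for worker, size in enumerate(worker_sizes, start=1):
        colors.extend([worker] * size)
    return colors

# 1 master rank + two DALES workers of 4 ranks each = 9 MPI ranks
print(split_colors(9, [4, 4]))  # → [0, 1, 1, 1, 1, 2, 2, 2, 2]
# Each rank would then call comm.Split(color=colors[rank], key=rank)
# to obtain its worker-local communicator.
```

This is why the worker count must be known in advance: the total job size is fixed when the MPI job is launched, unlike with dynamic process spawning.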

Experience with the Special Project framework
(Please let us know about your experience with administrative aspects like the application procedure, progress reporting etc.) We found the administrative aspects of the Special Project framework fairly straightforward and not too demanding. For progress reports, we especially appreciate the possibility to present results as a short summary with an existing scientific report appended; this conveys detailed information about scientific results while saving time in preparing the progress report.
We appreciate the support from ECMWF in using OpenIFS and in obtaining initial states for the model (mainly by Glenn Carver), and the technical support by the helpdesk for using the HPC equipment.

Summary of results
(This section should comprise up to 10 pages, reflecting the complexity and duration of the project, and can be replaced by a short summary plus an existing scientific report on the project.) Our article describing the regional superparameterization set-up and initial results appeared in 2019 in the Journal of Advances in Modeling Earth Systems, available fully Open Access via https://doi.org/10.1029/2018MS001600. One figure from the article was chosen as the cover illustration of the journal issue; moreover, the article was highlighted as a "Research Spotlight" in Eos (see https://doi.org/10.1029/2019EO132121). The preprint version of this paper was attached to the progress report that we submitted in June 2019. For completeness, the abstract is quoted once more at the end of this report.

We investigated the computational performance of the coupled OpenIFS-DALES model in another paper [Van den Oord et al., 2020-b], currently under review. Because of the review procedure rules we cannot append the preprint to this report at present; we will make it available once the review procedure is completed. The abstract of that paper reads: "We describe performance modeling and optimization efforts of (regional) superparametrization of the ECMWF weather model OpenIFS by cloud-resolving, three-dimensional large-eddy simulations. This setup contains a two-way coupling between a global meteorological model that resolves large-scale dynamics on the global scale, with many local instances of the Dutch Atmospheric Large Eddy Simulation (DALES) resolving cloud and boundary layer physics. The two MPI-parallel Fortran codes interact through a Python interface layer within the OMUSE framework. We study the performance and scaling behavior of the LES models and the coupling code and present our implemented optimizations.
We mimic the observed load imbalance with a simple performance model and present strategies to improve hardware utilization in order to assess the feasibility of a world-covering superparametrization." [Van den Oord et al., 2020-b]

Finally, from our simulations and experiments with the coupled OpenIFS-DALES model, we found a fundamental and important challenge: how to advect clouds and small-scale variability into (and out of) superparameterized model columns. We have completed our investigation of this issue, including numerical simulations, and are now writing a journal publication on it, to be submitted within approximately two months. The (preliminary) abstract of this paper in preparation reads: "In atmospheric modeling, superparameterization has gained popularity as a technique to improve the cloud and convection parameterizations of global atmospheric models by coupling them to local, cloud-resolving models. We show how the different representations of cloud water in the local and the global models in superparameterization lead to a suppression of cloud advection in the large-scale model. This phenomenon is demonstrated in a regional superparameterization experiment with the global model OpenIFS coupled to the local model DALES (the Dutch Atmospheric Large Eddy Simulation), and in an idealized setup where the large-scale model is replaced by a simple advection scheme. To mitigate the problem of cloud advection, we propose a scheme in which the spatial variability of the local model's total water content is nudged in order to achieve the correct cloud condensate amount." [Jansson et al., in preparation, 2020]
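The idea of nudging the spatial variability of a field while preserving its horizontal mean can be sketched in a few lines. This is an illustrative toy under our own assumptions (an explicit relaxation of perturbations toward a reference perturbation field); it is not the scheme of Jansson et al., whose formulation targets the cloud condensate amount:

```python
def nudge_variability(qt, qt_target, dt, tau):
    """One explicit nudging step acting only on perturbations.

    qt        : LES total water values across the horizontal domain
    qt_target : reference field whose spatial variability is desired
    dt, tau   : time step and relaxation time scale [s]

    The horizontal mean of qt is left unchanged; only the deviations
    from the mean are relaxed toward the reference deviations.
    """
    n = len(qt)
    mean = sum(qt) / n
    mean_t = sum(qt_target) / n
    out = []
    for q, q_ref in zip(qt, qt_target):
        pert, pert_ref = q - mean, q_ref - mean_t
        out.append(mean + pert + dt / tau * (pert_ref - pert))
    return out
```

Because only perturbations are relaxed, the large-scale (mean) water budget seen by the global model is untouched, which is the essential constraint for such a scheme.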

Future plans
(Please let us know of any imminent plans regarding a continuation of this research activity, in particular if they are linked to another/new Special Project.) The superparameterization framework developed within this special project will be used for simulations of the EUREC4A campaign. A special project request "Mesoscale Organisation of Shallow Cumulus Convection" including this research topic will be submitted in June 2020 by P. Siebesma et al.

Abstract
We present a Python interface for the Dutch Atmospheric Large Eddy Simulation (DALES), an existing Fortran code for high-resolution, turbulence-resolving simulation of atmospheric physics. The interface is based on an infrastructure for remote and parallel function calls and makes it possible to use and control the DALES weather simulations from a Python context. The interface is designed within the OMUSE framework, and allows the user to set up and control the simulation, apply perturbations and forcings, collect and analyze data in real time without exposing the user to the details of setting up and running the parallel Fortran DALES code. Another significant possibility is coupling the DALES simulation to other models, for example larger scale numerical weather prediction (NWP) models that can supply realistic lateral boundary conditions. Finally, the Python interface to DALES can serve as an educational tool for exploring weather dynamics, which we demonstrate with an example Jupyter notebook.
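The control pattern such an interface offers (configure, advance the model, read fields, apply a perturbation, continue) can be sketched with a mock stand-in object. All class, method, and field names below are hypothetical illustrations of the pattern, not the actual OMUSE Dales API:

```python
class MockDales:
    """Toy stand-in for a remotely controlled LES model instance."""

    def __init__(self, itot=32, jtot=32):
        self.time = 0.0
        # a single prognostic "field": total water, one value per column
        self.qt = [[5e-3] * jtot for _ in range(itot)]

    def evolve_model(self, target_time):
        # the real model would integrate its equations up to target_time
        self.time = target_time

    def get_field(self, name):
        assert name == "QT", "only QT is modeled in this toy"
        return [row[:] for row in self.qt]  # copy, as a remote call would

    def set_field(self, name, values):
        assert name == "QT"
        self.qt = values

d = MockDales()
d.evolve_model(300.0)            # run to t = 300 s
qt = d.get_field("QT")
qt[0][0] += 1e-3                 # apply a local moisture perturbation
d.set_field("QT", qt)            # push the perturbed field back
d.evolve_model(600.0)            # continue the simulation
```

The point of the pattern is that the user script interleaves model time stepping with inspection and modification of the model state, without touching the Fortran code or its input files.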
Keywords: Large-eddy simulation, Atmospheric sciences

This library version of DALES can also be used independently of OMUSE or Python interfaces, since its functions can be called directly from Fortran. The second modification that has been made is the option to pass an MPI