Permafrost modelling with OpenFOAM®: New advancements of the permaFoam solver

Article history: Received 7 September 2021 Received in revised form 25 August 2022 Accepted 6 September 2022 Available online 14 September 2022


Introduction
The permafrost areas of the northern hemisphere are currently warming twice as fast as the global average [17]. Fast and important changes of both the hydrological [58] and thermal [34] states of the northern continental surfaces are currently observed in response to permafrost thaw. In that context, arctic and sub-arctic areas are prone to major biogeochemical and ecological transformations due to permafrost thaw, with strong associated feedbacks on greenhouse gas cycling (e.g.: [25,59]). Fast-increasing economic activity in cold regions is also impacted by climate change [14,51,46,2,23,50,24], and may itself generate important impacts on permafrost at the local scale [57,47,33]. Overall, the evolution of permafrost under changing environmental conditions is of primary importance [38]. Thus, the development of advanced numerical modelling tools for permafrost dynamics is an active research field (e.g.: [37,9,26,5,20,32,60,63,67]), in order to help predict the impact of permafrost thaw on arctic thermo-hydrologic functioning and to provide mechanistic understanding of Arctic environmental changes under climate warming. This knowledge is necessary to further understand carbon cycling and contaminant/nutrient transport, and to assess risks and opportunities for water resources, sustainable urbanization, agriculture and the general sustainable development of the (sub-)Arctic.
Since permafrost is year-round frozen soil, topped by an active layer close to the surface that freezes and thaws seasonally, dealing with its dynamics implies considering both water and heat transfers within soils. However, these two phenomena are non-linear and strongly coupled, which makes their numerical resolution highly challenging. For instance, the apparent heat transfer properties of the soil, such as its apparent thermal conductivity or heat capacity, are impacted by its water content. On the other hand, the freezing of the pore water also induces a dramatic decrease in the apparent hydraulic conductivity. These couplings and non-linearities prevent the complex set of equations describing the problem from being solved analytically without strong hypotheses [31]. For the same reasons, the numerical approach represents a huge challenge, since it requires both good stability of the solvers and a large amount of resources to perform the computation [58]. That is the reason why the need for high-performance computing in the field of cryohydrogeological simulation is widely acknowledged [43], and several permafrost modelling tools with High Performance Computing capabilities have recently been developed (e.g.: [26,45]).
Our contribution to the development of permafrost modelling is an OpenFOAM® cryohydrogeological solver, named "permaFoam". The permaFoam solver presented here is based on a previous version published as electronic supporting information in Orgogozo et al. [41] and on the RichardsFoam3 solver [39,40,42]. By solving the coupled equations of heat transfer and water flow in variably saturated porous media with freeze/thaw of the pore water and evapotranspiration, it enables the modelling of permafrost subsurface dynamics. Its development in the framework of OpenFOAM® allows efficient use of modern supercomputing infrastructures (e.g.: [48,11]) and benefits from the momentum of a large community including both academic and industrial users.

Physical equations
The physical problem at stake is the coupled transfer of water and heat in a variably saturated porous medium, taking into account the freezing of pore water and the evapotranspiration water uptake. As a cryohydrogeological solver, permaFoam only deals with sub-surface processes, modelling permafrost itself, and this paper therefore focuses on the sub-surface processes. Nonetheless, permafrost dynamics are strongly coupled with surface processes such as snowpack dynamics (e.g.: [21]) or evapotranspiration (e.g.: [41]), which control the surface energy and mass balance and thus the heat and water fluxes through the top boundary of the permafrost. Thus, in many cases, building proper top boundary conditions for permaFoam will require field measurements, assumptions on surface processes, or coupling with adequate surface process models, such as Surface Energy Balance models (e.g.: [64,52,66]). One can refer to Endrizzi et al. [9] or to Painter et al. [45] for examples of permafrost modelling tools with integrated coupling of surface and subsurface processes. permaFoam models permafrost at the Darcy scale, dealing with transfers in porous media that may include four different phases: the solid phase, the liquid water phase, the frozen water (ice) phase, and the air phase. The soil is considered as an undeformable porous medium. Moreover, cryosuction and ice volumetric expansion are neglected. One can refer to Stuurop et al. [53] for a modelling work about cryosuction, and to McKenzie et al. [37] and Westermann et al. [61] for examples of permafrost modelling works in which ice volumetric expansion is neglected. Due to these different approximations, cryo-mechanical processes are not considered. One can refer to Huang and Rudolph [18] and to Huang et al. [19] for modelling works dealing with coupled thermo-hydro-mechanical phenomena in variably saturated porous media with freeze/thaw. The mathematical description adopted in the present work has been widely used in the literature (e.g.: [30,13]) and was recently discussed in Orgogozo et al. [41]. It involves a basic coupling between water flow and heat flow, in the sense of Yu et al. [65]. The two primary variables that describe the physical state of water in the soil are the generalized water pressure head h [m] (positive in saturated conditions, negative in unsaturated conditions, see [49]) and the soil temperature T [K] (defined by assuming local thermal equilibrium, see [15] for a discussion of this assumption). The two equations governing these primary variables are a modified Richards equation with a source term accounting for actual evapotranspiration (Eq. (1)) for the generalized water pressure head h, and a macro-scale heat transfer equation for the porous medium with a term of latent heat exchange (Eq. (2)) for the soil temperature T.
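A commonly used form of these two governing equations, consistent with the formulations of the cited literature [30,13,41], is sketched below; the symbols follow Table 1, and the exact terms and sign conventions of the permaFoam implementation may differ in detail.

```latex
% Hedged sketch of the two governing equations; symbols as in Table 1.
\begin{align}
% (1) modified Richards equation with actual-evapotranspiration sink
C(h)\,\frac{\partial h}{\partial t}
  + \frac{\partial \theta_{\mathrm{ice}}}{\partial t}
  &= \nabla \cdot \left[ K(h,T)\, \nabla (h+z) \right] - S_{\mathrm{AET}} \\
% (2) heat transfer with conduction, advection and latent heat of freeze/thaw
C_{\mathrm{eq}}(h,T)\,\frac{\partial T}{\partial t}
  + C_{w}\,\vec{u} \cdot \nabla T
  &= \nabla \cdot \left[ \lambda_{\mathrm{eq}}(h,T)\, \nabla T \right]
  + L_{f}\,\rho_{\mathrm{ice}}\,\frac{\partial \theta_{\mathrm{ice}}}{\partial t}
\end{align}
```

Here $C(h)$ is the capillary capacity, $S_{\mathrm{AET}}$ the actual-evapotranspiration sink, and the Darcy velocity $\vec{u} = -K(h,T)\,\nabla(h+z)$ couples the two equations; the latent-heat term is positive when ice forms, since freezing releases heat.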
The parameterizations and constitutive laws needed to estimate the transfer properties embedded in the governing equations are the following (Eqs. (3) to (14)); the variables and parameters appearing in the equations are listed and briefly described in Table 1 (in the source code, the units used are the SI base units):

- Generalized Darcy's law;
- Apparent hydraulic conductivity [37];
- Mualem-van Genuchten parameterization of the soil retention curve for the pressure-based formulation of the Richards equation [54,28,4];
- Relative hydraulic conductivity with respect to water freezing [13], bounded below by the minimum value K_freezingmin;
- Soil Freezing Curve function [55] (one can refer to Dall'Amico et al. [8] and Painter and Karra [44] for an alternative approach to handling the freezing behaviour of soils);
- Ice volume fraction calculated from the total water volume fraction (note that since we consider a non-deformable porous medium, we neglect the volumetric expansion of ice [30]);
- Estimator of actual evapotranspiration [40,41];
- Equivalent thermal conductivity of the medium, taken as a geometric mean [10];
- Equivalent heat capacity of the medium, taken as an arithmetic mean [27].

Seven non-linearities and six couplings occur in the two coupled governing equations, making them especially difficult to solve numerically. This is the main reason why high-performance computing is required here, and thus why we chose to develop permaFoam in the framework of OpenFOAM®. Please note that the numbering of these equations is also used in the comments of the main source file of permaFoam, permaFoam.C, for the sake of clarity, and that the units used in the permaFoam sources are the SI base units presented in Table 1.
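To make two of these constitutive laws concrete, the sketch below (illustrative code, not taken from the permaFoam sources) implements the Mualem-van Genuchten relative hydraulic conductivity with respect to water saturation, and an exponential impedance-factor form of the relative conductivity with respect to water freezing, floored at a minimum value in the spirit of the K_freezingmin parameter; the impedance exponent omega is an illustrative assumption.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Effective saturation Se(h) from the van Genuchten retention curve,
// with shape parameters alpha [1/m] and n [-], m = 1 - 1/n.
double effectiveSaturation(double h, double alpha, double n) {
    if (h >= 0.0) return 1.0;                        // saturated conditions
    double m = 1.0 - 1.0 / n;
    return std::pow(1.0 + std::pow(alpha * std::fabs(h), n), -m);
}

// Mualem model: K_r(Se) = sqrt(Se) * [1 - (1 - Se^(1/m))^m]^2
double kRelativeSaturation(double Se, double n) {
    double m = 1.0 - 1.0 / n;
    double tmp = 1.0 - std::pow(1.0 - std::pow(Se, 1.0 / m), m);
    return std::sqrt(Se) * tmp * tmp;
}

// Impedance factor: K_freezing = max(10^(-omega * theta_ice/theta_tot), Kmin)
double kRelativeFreezing(double iceFraction, double totalWater,
                         double omega, double kFreezingMin) {
    double k = std::pow(10.0, -omega * iceFraction / totalWater);
    return std::max(k, kFreezingMin);
}
```

The product of the two relative conductivities with the saturated conductivity then gives the apparent hydraulic conductivity, which drops by orders of magnitude as the pore water freezes.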

Resolution strategy
The numerical methods used to solve these two coupled, non-linear equations are classical: DIC (Diagonal Incomplete Cholesky) preconditioning and a PCG (Preconditioned Conjugate Gradient) linear solver for the water transfer equation; DILU (Diagonal Incomplete Lower Upper) preconditioning and a BiPCG (Preconditioned Bi-Conjugate Gradient) linear solver for the heat transfer equation; two Picard loops (fixed-point method), one for the linearization of each equation; sequential operator splitting for the handling of the couplings between the two equations; explicit resolution of the phase change (freeze/thaw of pore water); and an implicit/backward Euler method for temporal integration with adaptive time stepping (see [41] for more numerical details). Fig. 1 presents a simple schematization of the adopted numerical strategy.
The use of an operator-splitting approach for dealing with coupled equations is widely adopted in porous media numerics [7,22,35], although it leads to splitting errors that come in addition to the truncation errors related to the finite volume discretisation. Since splitting errors generally increase when the spatio-temporal discretisations are coarsened, particular attention should be paid to convergence studies for numerical resolutions of coupled problems based on an operator-splitting approach. A possible alternative to operator splitting would be a single block-coupled discretisation of the equations (e.g.: [3,6]). The adaptive time stepping approach is a so-called Empirically Based Adaptive Strategy [62]; it is analogous to the one presented in Orgogozo et al. [39].
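The time-integration pattern described above can be illustrated on a scalar toy problem. The sketch below (a minimal illustration, not permaFoam code) applies a backward Euler step linearized by a Picard fixed-point loop, with empirically based adaptive time stepping: the step is halved when the Picard loop fails to converge and enlarged when it converges quickly. The model problem dT/dt = -T^2/2 stands in for one split operator; all tolerances are illustrative.

```cpp
#include <cassert>
#include <cmath>

// Toy non-linearity playing the role of one split operator.
static double f(double T) { return -0.5 * T * T; }

// Backward Euler with Picard linearization and adaptive time stepping.
double integrate(double T0, double tEnd) {
    double T = T0, t = 0.0, dt = 0.1;
    const double dtMin = 1e-6, dtMax = 0.5, tol = 1e-10;
    while (t < tEnd - 1e-12) {
        if (t + dt > tEnd) dt = tEnd - t;            // do not overshoot tEnd
        double Titer = T, Tnew = T;
        int it = 0;
        bool converged = false;
        for (; it < 50; ++it) {                      // Picard (fixed-point) loop
            Tnew = T + dt * f(Titer);                // implicit step, lagged T
            if (std::fabs(Tnew - Titer) < tol) { converged = true; break; }
            Titer = Tnew;
        }
        if (!converged && dt > dtMin) { dt *= 0.5; continue; }  // retry smaller
        T = Tnew;
        t += dt;
        if (it < 5 && dt < dtMax) dt *= 1.2;         // fast convergence: grow dt
    }
    return T;
}
```

In permaFoam the same pattern is applied to each equation, with the sequential operator splitting alternating the flow and heat solves within every time step.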
Given the strong non-linearities and couplings encountered, few reference analytical solutions exist for the considered set of equations, and only for very simplified problems. Performing code-to-code intercomparison exercises is therefore a commonly accepted method for validating numerical simulators in cryohydrogeology, and this has been done for the validation of the permaFoam implementation for the simulation of coupled heat transfer and water flow with freeze/thaw in saturated porous media [13]. Considering water flow in variably saturated porous media, permaFoam inherits the validation that has been done for RichardsFoam [39], since it uses the same numerics for the resolution of the Richards equation.

New features embedded in this release of permaFoam
This permaFoam package encompasses the new features that have been introduced in RichardsFoam3 [42] along with other specific developments: (i) better handling of memory use during computation; (ii) the possibility of on-the-fly modification of the control parameters of the linearization procedure; (iii) new boundary conditions, compiled along with the solver itself, dedicated to the simulation of rain infiltration (rainFallFlux) or no-infiltration (noRainFlux) conditions while allowing exfiltration, with OpenFOAM® stand-alone (i.e. without the mandatory use of swak4foam, as was the case in RichardsFoam2, Orgogozo [40]; see Section 3.3); (iv) computation of the field of piezometric head along the simulation (NB: in the demonstration cases, a way to implement a constant piezometric head boundary condition with OpenFOAM® is proposed); (v) new functions that allow the seasonal variability of boundary conditions to be represented in an easy way with OpenFOAM® stand-alone, using the 'seasonal' function object (see Section 3.4), including the 'thawing' option (see Section 3.5); (vi) use of postprocessing procedures with OpenFOAM® stand-alone (i.e. without the mandatory use of swak4foam, as was the case in RichardsFoam2, Orgogozo [40]); (vii) an update of the capillary capacity parametrization according to the pressure-based formulation of the Richards equation proposed in Kavetski et al. [28] (see Eq. (6) and associated references); (viii) several other minor rewritings/cleanings of the code.

The rainFallFlux and the noRainFlux boundary conditions
These two boundary conditions deal with the hydrologic relations between the sub-surface and the surface in a very simple, sub-surface-oriented way, since permaFoam is a cryohydrogeological solver. The goal is to enable imposing an infiltration flux (>0 with rainFallFlux, =0 with noRainFlux) through a boundary (typically the top surface of the soil) while prescribing that, in case of saturation of the boundary cells, the imposed-flux condition is turned into an imposed pressure head condition, with a pressure head equal to the atmospheric pressure. This is equivalent to the top pressure boundary conditions constructed with swak4foam in Orgogozo et al. [41]; one can find in this previous paper a discussion of the application of this kind of boundary condition to cryohydrogeological modelling. In such a way, the computed infiltration takes into account the fact that no rain water may enter the soil if the soil surface is already saturated, and exfiltration fluxes (water coming out of the soil surface due to subsurface flow) may occur and be quantified.
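The switching rule behind these boundary conditions can be sketched as follows; this is a hypothetical illustration of the logic, not the actual OpenFOAM implementation, and the names FaceCondition and surfaceCondition are invented for the example.

```cpp
#include <cassert>

// A prescribed-flux face turns into a prescribed-head face (atmospheric
// pressure, h = 0) as soon as the boundary cell saturates, so that no rain
// enters an already saturated surface while exfiltration remains possible.
struct FaceCondition {
    bool fixedHead;   // true: Dirichlet h = 0 (atmospheric); false: Neumann flux
    double value;     // head [m] if fixedHead, infiltration flux [m/s] otherwise
};

FaceCondition surfaceCondition(double h, double rainFlux) {
    if (h >= 0.0)                 // boundary cell saturated
        return {true, 0.0};       // switch to atmospheric-pressure head
    return {false, rainFlux};     // impose the flux (0 in the noRainFlux case)
}
```

With rainFlux set to zero this reduces to the noRainFlux behaviour, where only exfiltration through saturated faces can occur.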

The 'seasonal' function object
The seasonal function object has been developed specifically to enable the easy implementation of simple seasonal variations of boundary conditions, which are very useful for modelling permafrost, although it can also be applied in other contexts where time-varying boundary conditions are needed. It allows mixing two time-evolving modes, where each mode can be a sine, a cosine or a square function. The two modes are then concatenated according to a user-specified periodic time frame, using again a basic time-dependent function (dualMode true; modeselector square1;). In the example below (see Fig. 2), extracted from the initial temperature volScalarField 0/T of the demoCase_sinusoidalClimateForcings case (see the tutorials in the permaFoam package), two sinusoidal modes are concatenated using a simple mode selector, mode 1 being used when modeSelector=0 and mode 2 when modeSelector=1.
The implementation in the dictionary of the patch top (representing the soil surface) within the 0/T file of the demoCase_sinusoidalClimateForcings tutorial can be found in the permaFoam package. There is a strong interest in using continuous time-dependent evolutions for boundary conditions, because in case of discontinuities steep fronts occur, resulting in a substantial increase of numerical difficulties and hence of computing time. In the meantime, using only one purely sinusoidal evolution to capture seasonality may be a problematic over-simplification for some analyses. The seasonal function object has been developed to propose a trade-off between these two concerns.
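The mode-concatenation idea can be sketched numerically as below; this is an illustration of the principle, not permaFoam code, and the amplitudes, offsets and period are invented values. Two sinusoidal modes share the same period, and a square-wave selector picks one mode per half-period; since both sinusoids cross their common offset at the switching instants, the resulting signal stays continuous.

```cpp
#include <cassert>
#include <cmath>

const double PI = std::acos(-1.0);
const double kYear = 365.0 * 86400.0;   // period [s], illustrative

// Two sinusoidal modes with the same period but different amplitudes [K].
double mode1(double t) { return 263.15 + 10.0 * std::sin(2.0 * PI * t / kYear); }
double mode2(double t) { return 263.15 + 25.0 * std::sin(2.0 * PI * t / kYear); }

// Square-wave selector: 0 during the first half-period, 1 during the second.
int modeSelector(double t) {
    double phase = std::fmod(t, kYear) / kYear;
    return phase < 0.5 ? 0 : 1;
}

// Concatenated seasonal signal: continuous because both modes coincide
// (value 263.15) at the switching instants.
double seasonalValue(double t) {
    return modeSelector(t) == 0 ? mode1(t) : mode2(t);
}
```

Such a composite signal can, for instance, represent a mild warm half-year and a much colder half-year with a single periodic boundary condition.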

The 'thawing' option
Additionally, a specific feature has been added for establishing soil surface boundary conditions in cold environments: the thawing option of the seasonal function object. When thawing or freezing occurs at the top surface, one may want to start or stop a given process (for instance, infiltration modelled with a rainFallFlux boundary condition) depending on the thermal status of the faces of the top surface patch. Generally, the permaFoam solver deals natively with such threshold effects by considering a temperature of freezing prescribed in the scalar field Tmelt. Nevertheless, since the apparent hydraulic conductivity of a frozen soil is very low, imposing a water flux with, for instance, rainFallFlux on a frozen soil may lead to very high pressure heads, and so may generate numerical difficulties. This may occur for instance when a snowpack melts above a frozen soil. Since in such a case the amount of infiltrated water is very low, it may be of interest to directly turn the infiltration flux to zero as long as the top soil is frozen. In some cases one may even want to slightly shift the temperature threshold to ease the numerical resolution, for instance in case of thick cells at the top surface or fast-changing conditions at the patch. This is allowed by the thawing option, which is an indicator function that has the value 1 when T >= Tmelt+Tshift, and the value 0 when T < Tmelt+Tshift. One can find an example of use of the thawing option in the 'permaFoam_demoCase_sinusoidalClimateForcings' test case provided in the permaFoam package, in the top patch describing the boundary field of the initial pressure file "0/psi". In this patch, the temperature of start of infiltration is fixed at Tmelt+Tshift = 273.15 K. Such a feature may significantly ease the numerical resolution of cases with particularly steep conditions at the period of initiation of thawing or freezing. The approximation related to the use of the 'thawing' option may generate additional errors in the numerical resolutions, so a problem-wise trade-off must be made between the accuracy of the simulation and the decrease of the computation time. For instance, turning off the infiltration when the top soil temperature is close to 0 °C may have significant impacts on the modelling of hourly to daily time scale variations of the hydrological and thermal state of the active layer during snowmelt [16]. Meanwhile, if the interest is in seasonal or multi-annual trends, such errors on high-frequency variabilities might be acceptable, depending on the application at stake.
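The indicator itself is a one-line rule; the sketch below (an illustration following the definition in the text, with the helper name effectiveFlux invented for the example) shows how multiplying a prescribed infiltration flux by the indicator switches infiltration off over frozen ground.

```cpp
#include <cassert>

// 'thawing' indicator: 1 when the face temperature has reached the
// (optionally shifted) melting threshold, 0 otherwise. All temperatures in K.
int thawing(double T, double Tmelt, double Tshift) {
    return (T >= Tmelt + Tshift) ? 1 : 0;
}

// Effective infiltration flux [m/s]: zero as long as the top soil is frozen.
double effectiveFlux(double rainFlux, double T, double Tmelt, double Tshift) {
    return rainFlux * thawing(T, Tmelt, Tshift);
}
```

A positive Tshift delays the onset of infiltration slightly beyond the melting point, which is the threshold-shifting trick mentioned above for easing the numerical resolution.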

3D computation in Kulingdakan watershed, Central Siberia
The Kulingdakan watershed has been studied numerically with a special focus on evapotranspiration fluxes in Orgogozo et al. [41]. In that previous study, 2D computations of thermo-hydrological fluxes in the slopes of the catchment were carried out. More recently, 3D simulations of the permafrost of this watershed have been started, as preliminary work for upcoming simulations under climate change. Fig. 3 gives a brief presentation of the Kulingdakan watershed along with views of the simplified representation used for the spatial discretization of the 3D simulations.
In order to simulate the seasonal permafrost dynamics under the current climate, the permafrost is simulated with multi-annual means of climate forcings (top soil temperature, liquid rain rate), cycling the simulations over several years to reach a kind of dynamical equilibrium under current conditions (spin-up). Fig. 4 shows the computed seasonal evolution of the temperature profiles in the middle of the South aspected slopes (SAS) and in the middle of the North aspected slopes (NAS).

A good agreement is obtained with respect to the field observations. For instance, the computed maximum thicknesses of the active (thawed in summer) layer are 0.8 m in NAS and 1.1 m in SAS, while the observed maximum active layer thicknesses are 0.6 m in NAS and 1.2 m in SAS [12]. A slightly better agreement was found on that metric with the 2D computations of Orgogozo et al. [41], likely because the spin-up (cycling of several years of modelling for estimating the initial conditions) was more complete in that previous study. In 3D these simulations are costly in terms of CPU hours, requiring several hundreds of hours on the OCCIGEN supercomputer using parallel computation on 4000 cores simultaneously; thus we did not push the cycling as far as in the previous 2D study. The mesh used was composed of 57 × 10^6 cells, while the average time step was 303 s (minimum time step: 0.76 s; maximum time step: 600 s). This illustrates the need for High Performance Computing in permafrost modelling.

2D computation in Syrdakh watershed, Eastern Siberia
The Syrdakh study site is located roughly 100 km NE of Yakutsk, in Central Yakutia (Eastern Siberia, Russia). A cross-section to a river has been instrumented for thermal and hydrological measurements since 2012. This landscape unit is a base element of a water catchment (see Fig. 5). Temperature is monitored in the river and below it (locations shown as red crosses in Fig. 5a). Ground temperatures are monitored at various depths and distances from the river (Fig. 5a: F8, F4 and F3 provide monitoring at depths of 1, 2, 3 and 4 meters; F13 and F14 monitor depths of 30 cm, 1.2 m, 2 m and 3 m). Additionally, soil temperatures close to the ground surface across the valley are monitored to provide surface boundary conditions for the numerical simulation of coupled thermo-hydrological (TH) processes (3 zones are discriminated: shadowed zone, mixed zone and river bed, see Fig. 5a). The hydrological variables monitored are river levels and volumetric water contents at various depths and two distances from the river (VWC1 at High and Low locations, displaying VWC at 10, 40 and 70 cm depths). Field studies, generally carried out in September, provide access to topography, soil properties, vegetation cover, supra-permafrost water levels and active layer depths across the valley. Thermal and hydrological parameters are measured from pits. More details can be found in Grenier et al. (in prep.).
The numerical simulation presented here is a preliminary run aiming at providing a basic representation of coupled TH processes and at identifying data or process gaps. After a spin-up, a converged mean year representing the conditions encountered from 2012 to 2019 is obtained. The full set of coupled TH equations, including vegetation transpiration and evaporation, is solved (Eqs. (1) and (2)).
The modelled domain represents the left bank of the valley containing the denser monitoring network: 25 m wide for 5 m of ground (Fig. 5a). The surface thermal boundary conditions are imposed temperature signals issued from near-surface temperature monitoring (3 zones: shady part, mixed zone and river bed). Surface hydrological conditions are precipitation over land and an imposed head for the river bed. All other boundary conditions are zero-flux (temperature and water). The evapotranspiration source term is introduced as a term varying exponentially with depth and according to the seasons. Average-year conditions are repeated from the initial conditions until convergence of the temperature and liquid volumetric water content fields (spin-up). The temperature and volumetric liquid water content fields are represented in Fig. 5b, displaying conditions at mid-September: the deepest active layer of the year (unfrozen region above permafrost) is obtained, and the influence of the river recharging the supra-permafrost aquifer is visible, leading to high liquid water contents since the vegetation has stopped pumping at this time of the year.
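A source term varying exponentially with depth and seasonally can be sketched as below; this is a hypothetical illustration of the functional form only, and the decay length zRoot, the peak rate and the on/off growing-season window are invented values, not the parameterization used in the Syrdakh run.

```cpp
#include <cassert>
#include <cmath>

// Evapotranspiration sink [m^3 of water per m^3 of soil per s]:
// a seasonal surface rate attenuated exponentially with depth.
double etSink(double depth, double dayOfYear,
              double peakRate, double zRoot) {
    // crude growing-season window: uptake active roughly days 150 to 260
    double seasonal = (dayOfYear > 150.0 && dayOfYear < 260.0) ? 1.0 : 0.0;
    return peakRate * seasonal * std::exp(-depth / zRoot);
}
```

A smooth seasonal envelope (e.g. sinusoidal) would be preferable in practice for the same numerical-continuity reasons discussed for the seasonal boundary conditions.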
Results from this average year are then compared with monitoring series averaged over the 2012-2019 period for all monitoring locations (Fig. 5a). Results in Fig. 6 show that the seasonal evolution is well captured in terms of timing and trends, for all monitoring depths. The propagation of freezing and thawing is qualitatively well represented, while vegetation and evaporation uptake are active during summer, competing with vertical infiltration from precipitation and lateral river inflows. However, the temperature and volumetric water content time series are not quantitatively well matched, requiring future parameter adjustments (especially the freezing curve and the vegetation uptake level and history). This preliminary simulation demonstrates the ability to represent realistically the seasonal evolution of a river-valley system with the ad hoc set of complex processes.

Parallel performances
In this section, we evaluate the numerical performance of permaFoam, and especially its scalability up to a large number of cores. The physical problem used for this study is analogous to the one described by the "democase_sinusoidal_forcing" test case provided together with the solver. Only the spatial scales, and especially the horizontal extents of the considered domains, change compared to the basic test case. Three geometries have been used to evaluate performance: each is an inclined square with 102 m, 307 m or 700 m long sides and a thickness of 1 m, forming a 3D volume very similar in shape to one side of the geometry studied in the previous section. Each lateral extent corresponds to a different mesh size, from 20 × 10^6 to 10^9 cells, so that the refinement of the discretization is kept unchanged between cases. The boundary conditions include rainFallFlux and noRainFlux conditions, while the initial condition is chosen so that thawing occurs in the domain (initial time during June). In this way, this configuration is considered representative of real 3D cases and takes into account all the physical processes that permaFoam models. Performance is evaluated by simulating 1 min of physical time through 60 time steps of 1 s.
The first mesh we study is a 512×512×85 grid, referred to as the "20M" mesh. We show in Fig. 7 that the topological distribution over the grid of the decomposed domains associated with each MPI process can dramatically affect the performance. The domain is split along one, two or three directions of space, as indicated by each curve label in Fig. 7; for example, in the (xz) case, the subdomains are created by cutting the geometry in the x and z directions. For each case, we perform the test on a range of 36 to 1800 cores. The dimensionless execution times are calculated by dividing the results by the maximum execution time. A large difference between the strategies is obtained, since one order of magnitude separates the fastest decomposition strategy from the slowest one. The best performance is obtained using the (xy) decomposition, which assigns to each MPI process a subdomain covering the whole thickness of the domain. This result is expected, as the main computational load is located at the thaw front, i.e. close to the surface of the soil. Thus, any decomposition in the (z) direction deteriorates the load balancing of the simulation.
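As an illustration, a decomposeParDict implementing such an (xy) strategy with the standard decomposePar utility could look as follows; the subdomain counts are illustrative and not those used in the study.

```
// Sketch of a decomposeParDict for an (xy) decomposition: the domain is
// split only in the x and y directions (nz = 1), so that each MPI process
// holds the whole soil column and the thaw-front load stays balanced.
numberOfSubdomains 1024;

method          simple;

simpleCoeffs
{
    n           (32 32 1);   // splits along (x y z); no cut along z
    delta       0.001;
}
```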
Using this decomposition strategy, we evaluate the parallel performance of permaFoam with a strong scaling study on three different HPC centers: CALMIP (Olympe supercomputer), CINES (Occigen supercomputer) and TGCC (IRENE supercomputer, ROME partition), and for three problems of increasing size: one with a 20 × 10^6 cell mesh, noted 20M, one with a 200 × 10^6 cell mesh, noted 200M, and one with a 10^9 cell mesh, noted 1B. This comparative study gives insights into the parallel performance by covering different hardware environments (processors, architectures, ...), core numbers (up to 16384 cores) and computational loads. Results are shown in Fig. 8. PermaFoam demonstrates excellent parallel performance on every tested supercomputer. For instance, it shows results beyond ideal scalability up to 6000 cores for the 20M case on every supercomputer used. This super-scalable behaviour, already noticed and discussed in Orgogozo et al. [39], may come from memory cache effects, as well as from a dependence of the operating conditions of the linear solvers on the size of the subdomains, although no effect on the results has been observed yet.
When considering the relatively small mesh of 20 × 10^6 cells, scalability tends to decrease at higher numbers of cores, even if a 50% parallel efficiency is still observed at the highest number of cores (16384 cores on IRENE-ROME, Fig. 8(c)). With large numbers of cores, the communication costs relative to the mesh size are expected to affect the performance. Indeed, at 4000 cores, where the scalability starts to decrease, 70% of the cells in the memory of a given MPI process are ghost cells, used for communication or boundary conditions.
To evaluate this effect, a 200M-cell mesh is created by increasing the cell number in both the x and y directions (1536 cells in each direction), as well as the size of the surface plane (307 m long side). In this way, the number of cells is increased without any change in the physics solved or in the size of the mesh cells. With this mesh, the scalability of permaFoam remains ideal, even with the use of 7000 cores (Fig. 8(b)). Since the number of cells per core is higher in the 200M case than in the 20M case, we observe, as expected, better parallel performance at high numbers of cores in the former than in the latter. For the 200M test case, linear to super-linear behaviour is observed on all considered supercomputers over the whole range of MPI processes used, up to 16384 on IRENE-ROME (Fig. 8(c)). One should note that using a large number of cores with a large mesh represents in itself a particular challenge in terms of the numerical resources required, especially for the mesh partitioning step. For instance, 3 days of run are required to decompose the domain of the 200M mesh into 7000 MPI subdomains using the decomposePar OpenFOAM utility on a fat node with 3 TB RAM (Intel© Skylake Xeon Platinum 8176 @ 2.1 GHz) available on the Occigen supercomputer.
We also demonstrate the capacity of permaFoam to address very large problems, since we performed a scalability study up to 1800 cores on a 1-billion-cell grid (Fig. 8(a)), with excellent stability and good scalability, the only requirement being to recompile OpenFOAM to allow the use of large integer numbers (label encoding on 64 bits).
Overall, thanks to its development within OpenFOAM, permaFoam benefits from good parallel performance at different computing centers with different supercomputers. When considering large meshes, the scalability remains excellent even for the largest numbers of cores that are usable in practice for our applications (∼10 000), given the usual Quality of Service limits (wall time) of the fat nodes available on the supercomputers used for preprocessing operations (especially mesh partitioning).

Conclusion and perspectives
In this paper we presented permaFoam, an OpenFOAM® solver for permafrost hydrology. Thanks to the good parallel capabilities of OpenFOAM®, and to the inclusion of the main processes involved, permaFoam makes it possible to address permafrost modelling for large systems and on multi-annual time scales. It will soon be applied to the study of climate change impacts on the permafrost of several environmentally monitored boreal watersheds in the framework of the HiPerBorea project (hiperborea.omp.eu). One should note that such a tool may also be applied in other contexts where freeze/thaw in porous media occurs, such as artificial ground freezing [1], geothermal energy [29] or geohazards in high-altitude environments [36].

Declaration of competing interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Fig. 1. Numerical strategy used for the resolution of the two non-linear, coupled equations of permafrost dynamics considered in permaFoam.

Fig. 2. Seasonally variable thermal boundary condition used for modelling the North Aspected Slopes of the Kulingdakan watershed (see also Section 4.1).

Fig. 4. Temperature profiles in North and South Aspected Slopes: good agreement with field observations in terms of active layer depths.

Fig. 5. Overview of the valley, model geometry and simulation maps: (a) view of the modelled valley with simulation domain boundaries, boundary condition zonation (Shade, Mixed, River) and monitoring locations (F8, F4, F3, F13&14 monitoring temperature; VWC1 at high and low locations monitoring volumetric water content); (b) and (c) simulation maps for temperature and volumetric liquid water content fields, respectively, at mid-September. (For interpretation of the colours in the figure(s), the reader is referred to the web version of this article.)

Fig. 7. Execution times obtained using different MPI decomposition strategies. Each curve corresponds to the specific splitting direction(s) indicated in its label. Calculations were performed on the OLYMPE supercomputer (Intel© SKYLAKE processors).

Fig. 8. Strong scaling of permaFoam at three different HPC centers. Ideal scalability is displayed with a dashed line. Three meshes are tested, with 20M, 200M and 1B cells. Each scalability curve has its own reference core number.