The Parcels v2.0 Lagrangian framework: new field interpolation schemes

With the increasing amount of data produced by numerical ocean models, the need for efficient tools to analyse these data grows accordingly. One of these tools is Lagrangian ocean analysis, where a set of virtual particles is released and their dynamics are integrated in time based on fields defining the ocean state, including the hydrodynamics and biogeochemistry if available. This popular methodology needs to adapt to the large variety of models producing these fields in different formats. This is precisely the aim of Parcels, a Lagrangian ocean analysis framework designed to combine (1) a wide flexibility to model particles of different natures and (2) an efficient implementation in accordance with modern computing infrastructure. In the new Parcels v2.0, we implement a set of interpolation schemes to read various types of discretised fields, from rectilinear to curvilinear grids in the horizontal direction, from z- to s-levels in the vertical, and using grid staggering with the Arakawa A-, B- and C-grids. In particular, we develop a new interpolation scheme for a three-dimensional curvilinear C-grid and analyse its properties. Parcels v2.0 capabilities, including a suite of meta-field objects, are then illustrated in a brief study of the distribution of floating microplastic in the North West European continental shelf and its sensitivity to various physical processes.

, etc. Lagrangian ocean analysis is used at a wide range of temporal and spatial scales, for example in the modelling of plastic dispersion, from coastal applications (e.g. Critchell and Lambrechts, 2016) to regional (e.g. Kubota, 1994) or global scales (e.g. Maximenko et al., 2012).
The method consists in advancing, for each particle, the coordinates and other state variables: the fields of interest, such as the velocity or any tracer, are first interpolated at the particle position, and the ordinary differential equations defining the particle dynamics are then integrated in time.
A number of tools are available to track virtual particles, with diverse characteristics, strengths and limitations, including Ariane (Blanke and Raynaud, 1997), TRACMASS (Döös et al., 2017), CMS (Paris et al., 2013) and OpenDrift (Dagestad et al., 2018). An extensive list and description of Lagrangian analysis tools is provided in van Sebille et al. (2018). One of these tools is Parcels. Parcels ("Probably A Really Computationally Efficient Lagrangian Simulator") is a framework for computing Lagrangian particle trajectories (http://www.oceanparcels.org, Lange and van Sebille, 2017). The main goal of Parcels is to process the continuously increasing amount of data generated by the contemporary and future generations of ocean general circulation models (OGCMs). This requires two important features of the model: (1) not being dependent on one single format of fields and (2) being able to scale up efficiently to cope with up to petabytes of external data produced by OGCMs. In Lange and van Sebille (2017), the concept of the model was described and the fundamentals of Parcels v0.9 were stated. Since this version, the essence of the model has remained the same, but many features were added or improved, leading to the current version 2.0. Among all the developments, our research has mainly focused on developing and implementing into Parcels interpolation schemes that provide the possibility to use a set of fields discretised on various types of grids: from rectilinear to curvilinear in the horizontal direction, with z- or s-levels in the vertical, and using grid staggering with the Arakawa A-, B- and C-grids (Arakawa and Lamb, 1977). In particular, an interpolation scheme for curvilinear C-grids, which was not defined in other Lagrangian analysis models, was developed for both z- and s-levels.
In this paper, we detail the interpolation schemes implemented in Parcels, with special care given to the description of the new curvilinear C-grid interpolator. We describe the new meta-field objects available in Parcels for easier and faster simulations.
We prove some fundamental properties of the interpolation schemes. We then validate the new developments through a study of the sensitivity of the floating microplastic dispersion and 3D passive particle distribution on the North West European continental shelf and discuss the results.
2 Parcels v2.0 development

To simulate particle transport in a large variety of applications, Parcels relies on two key features: (1) interpolation schemes to read external data sets provided in different formats and (2) customisable kernels to define the particle dynamics. The interpolation schemes are necessary to obtain the field value at the particle 3D location. They have been vastly improved in this latest version 2.0. Section 2.1 describes the interpolation of fields and Section 2.2.1 their structure in Parcels.
The kernels have been available since the earliest version of Parcels (Lange and van Sebille, 2017). They are the core of the model, integrating particle position and state variables. The built-in kernels and the developments achieved are briefly discussed in Section 2.2.2.

Field Interpolation
External data sets are provided to Parcels as a set of fields. Each field is discretised on a structured grid that provides the node locations and instants at which the field values are given. It is noteworthy that the fields in a field set are not necessarily based on the same grid. In the horizontal plane, rectilinear (Fig. 1a) and curvilinear (Fig. 1b) grids are implemented. Three-dimensional data are built as the vertical extrusion of the horizontal grid, either using z-levels (Fig. 1c) or s-levels (Fig. 1d).
A three-dimensional mesh is thus a combination of a rectilinear or a curvilinear mesh in the horizontal direction with either z- or s-levels in the vertical. Another class of meshes are the unstructured grids (e.g. Lambrechts et al., 2008), which are not yet supported in Parcels.
Fields can be independent from each other (e.g. water velocity from one data set and wind stress from a different data set) and interpolated separately. Often, fields come from the same data set, for example when they result from a numerical model, and form a coherent structure that must be preserved in Parcels; an example is the zonal and meridional components of the velocity field. A coherent field structure is discretised on the same grid, but the variables are not necessarily distributed evenly, leading to a so-called staggered grid (Arakawa and Lamb, 1977). The various types of staggered grids have their pros and cons (Cushman-Roisin and Beckers, 2011). Parcels reads the most popular staggered grids occurring in geophysical fluid dynamics: A-, B- and C-grids. While A- and C-grids are fundamentally different (Fig. 2), the B-grid can be considered as a hybrid configuration in between A- and C-grids. D- and E-grids are less common and are not implemented in the framework so far. The methods to interpolate fields on the A-, C- and B-grids are presented in this section.
A-grid

This grid is used in many reanalysis data sets for global currents (e.g. Globcurrent, Rio et al., 2014) or tidal dynamics (e.g. TPXO, Egbert and Erofeeva, 2002), as well as regridded products such as OFES (Sasaki et al., 2008) or data sets on platforms like the Copernicus Marine Environment Monitoring Service. In this section we first consider the two-dimensional case before describing three-dimensional fields.

2D field
In a two-dimensional context, field f is interpolated in the cell (j, i) where the particle is located, using the bi-linear Lagrange polynomials $\varphi_n^{2D}$ and the four nodal values $F_n$, $n = 0, \dots, 3$, surrounding the cell, resulting in the following expression:

$f(x, y) = \sum_{n=0}^{3} \varphi_n^{2D}(\xi, \eta)\, F_n,$   (1)

with ξ, η such that:
$x = \sum_{n=0}^{3} \varphi_n^{2D}(\xi, \eta)\, X_n, \qquad y = \sum_{n=0}^{3} \varphi_n^{2D}(\xi, \eta)\, Y_n.$   (2)

ξ, η are the relative coordinates in the unit cell (Fig. 3b), corresponding to the particle relative position in the physical cell (Fig. 3a). $(X_n, Y_n)$ are the coordinates of the cell vertices. The two-dimensional Lagrange polynomials $\varphi_n^{2D}$ are the bi-linear functions:

$\varphi_0^{2D} = (1 - \xi)(1 - \eta), \quad \varphi_1^{2D} = \xi(1 - \eta), \quad \varphi_2^{2D} = \xi\eta, \quad \varphi_3^{2D} = (1 - \xi)\eta.$   (3)

Note that in a rectilinear mesh, solving Eq. 2 reduces to the usual solutions:

$\xi = \frac{x - X_0}{X_1 - X_0}, \qquad \eta = \frac{y - Y_0}{Y_3 - Y_0}.$

Figure 2. Arakawa's staggered grids (Arakawa and Lamb, 1977): (a) A-grid and (b) C-grid. In the C-grid, i and j represent the variable column and row indexing in the arrays where the variables are stored. The indexing of the C-grid follows the NEMO notations (Madec et al., 2016).
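The bi-linear interpolation of Eqs. 1–3 can be condensed into a short, self-contained Python sketch. This is an illustration with our own function names, not the Parcels implementation:

```python
import numpy as np

def phi2d(xi, eta):
    """Bi-linear Lagrange basis of Eq. 3, nodes ordered counter-clockwise 0..3."""
    return np.array([(1 - xi) * (1 - eta),
                     xi * (1 - eta),
                     xi * eta,
                     (1 - xi) * eta])

def interp_a_grid_2d(f_nodes, xi, eta):
    """Eq. 1: interpolate the four nodal values at relative position (xi, eta)."""
    return phi2d(xi, eta) @ np.asarray(f_nodes)

def relative_coords_rectilinear(x, y, x_w, x_e, y_s, y_n):
    """Rectilinear special case of Eq. 2, cell spanning [x_w, x_e] x [y_s, y_n]."""
    return (x - x_w) / (x_e - x_w), (y - y_s) / (y_n - y_s)
```

On a rectilinear cell this reproduces any bi-linear function exactly, which is a convenient way to verify such an implementation.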

3D field
To read three-dimensional fields, for both z- and s-levels, the scheme interpolates the eight nodal values located on the hexahedron vertices using the tri-linear Lagrange polynomials $\varphi_n^{3D}$:

$\varphi_n^{3D}(\xi, \eta, \zeta) = (1 - \zeta)\, \varphi_n^{2D}(\xi, \eta) \;\; \text{for } n = 0, \dots, 3, \qquad \varphi_n^{3D}(\xi, \eta, \zeta) = \zeta\, \varphi_{n-4}^{2D}(\xi, \eta) \;\; \text{for } n = 4, \dots, 7.$

ξ and η are obtained as in the 2D case, using the first four vertices of the hexahedron. The vertical relative coordinate is obtained as:

$\zeta = \frac{z - z_k}{z_{k+1} - z_k}.$   (4)

The interpolation results in:

$f(x, y, z) = \sum_{n=0}^{7} \varphi_n^{3D}(\xi, \eta, \zeta)\, F_n.$   (5)
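A tri-linear sketch in the same spirit, again illustrative only; building $\varphi^{3D}$ from the 2D basis as above assumes the bottom face holds nodes 0..3 and the top face nodes 4..7:

```python
import numpy as np

def phi2d(xi, eta):
    # Bi-linear basis (Eq. 3)
    return np.array([(1 - xi) * (1 - eta), xi * (1 - eta), xi * eta, (1 - xi) * eta])

def phi3d(xi, eta, zeta):
    """Tri-linear basis: bottom-face nodes 0..3, top-face nodes 4..7."""
    p = phi2d(xi, eta)
    return np.concatenate([(1 - zeta) * p, zeta * p])

def interp_a_grid_3d(f_nodes, xi, eta, zeta):
    """Eq. 5: interpolate the eight hexahedron nodal values."""
    return phi3d(xi, eta, zeta) @ np.asarray(f_nodes)
```

As in 2D, the scheme is exact for any tri-linear function of the relative coordinates.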

C-grid

In a C-grid discretisation, the velocities are located on the cell edges and the tracers are at the middle of the cell (Fig. 2b).
One could attempt to interpolate the zonal and meridional velocities as if they were on two different A-grids, shifted by half a cell, but this would not be in accordance with the C-grid formulation: in particular, such an interpolation would break the impermeability of the coastal boundary conditions. Instead, we use an interpolation principle based on the scheme used in the Lagrangian model TRACMASS (Jönsson et al., 2015; Döös et al., 2017), but generalised to curvilinear meshes.

The tracer is treated as a constant value over the whole cell, in accordance with the mass conservation schemes of C-grids.
The formulations for the two-dimensional and three-dimensional velocities consist of four steps:

1. define a mapping between the physical cell and a unit cell (as for the A-grid, Fig. 3);
2. compute the fluxes on the unit cell interfaces, as a function of the velocities on the physical cell interfaces;
3. interpolate those fluxes to obtain the relative velocity;
4. transform the relative velocity to the physical velocity.
Step 1 consists in applying Eq. 2 for the 2D case and Eqs. 2 and 4 for 3D fields. The other three steps are defined below, for both 2D and 3D fields.

2D field
The velocities at the edges of cell (j, i) are $(u_{j+1,i}, u_{j+1,i+1}, v_{j,i+1}, v_{j+1,i+1})$ (Fig. 2b). This indexing was chosen to be consistent with the notation used by the NEMO model (Madec et al., 2016). For readability, they will be renamed in this section to $(u_0, u_1, v_0, v_1)$, using local indices (Fig. 3a) instead of the global indices. The velocity at (x, y) is not obtained by linear interpolation but, as in finite-volume schemes, it is approximated by linearly interpolating the fluxes $(U_0, U_1, V_0, V_1)$ through the cell edges (Fig. 3b), which read (Step 2):

$U_0 = u_0\, L_{03}, \quad U_1 = u_1\, L_{12}, \quad V_0 = v_0\, L_{01}, \quad V_1 = v_1\, L_{23},$   (7)

where $L_{03}$, $L_{12}$, $L_{01}$, $L_{23}$ are the edge lengths. Secondly, $\mathbf{J}_{2D}(\xi, \eta)$, the Jacobian matrix of the transformation from the physical cell to the unit cell (Fig. 3), is defined:

$\mathbf{J}_{2D}(\xi, \eta) = \begin{pmatrix} \partial x/\partial\xi & \partial x/\partial\eta \\ \partial y/\partial\xi & \partial y/\partial\eta \end{pmatrix}.$   (8)

The determinant of the Jacobian matrix, which will be called the Jacobian, is computed: $J_{2D}(\xi, \eta) = \det \mathbf{J}_{2D}$. The Jacobian defines the ratio between an elementary surface in the physical cell and the corresponding surface in the unit cell. The relative velocity in the unit cell is defined as (Step 3):

$\left( \frac{d\xi}{dt}, \frac{d\eta}{dt} \right) = \frac{1}{J_{2D}(\xi, \eta)} \Big( (1 - \xi)\, U_0 + \xi\, U_1,\; (1 - \eta)\, V_0 + \eta\, V_1 \Big).$   (9)

Finally (Step 4), the velocity is obtained by transforming the relative velocity (Eq. 9) to the physical coordinate system:

$\begin{pmatrix} u \\ v \end{pmatrix} = \mathbf{J}_{2D}(\xi, \eta)\, \begin{pmatrix} d\xi/dt \\ d\eta/dt \end{pmatrix}.$   (10)
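Steps 2–4 can be condensed into a short Python sketch for a single quadrilateral cell. This is a minimal illustration with our own function names, not the Parcels code; the edge-normal velocities u0, u1, v0, v1 are assumed given:

```python
import numpy as np

def cgrid_velocity_2d(X, Y, u0, u1, v0, v1, xi, eta):
    """2D curvilinear C-grid velocity at relative position (xi, eta).

    X, Y: the four cell vertex coordinates, ordered 0..3 as in Fig. 3a."""
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    # Step 2: fluxes through the edges (velocity times edge length, Eq. 7)
    U0 = u0 * np.hypot(X[3] - X[0], Y[3] - Y[0])
    U1 = u1 * np.hypot(X[2] - X[1], Y[2] - Y[1])
    V0 = v0 * np.hypot(X[1] - X[0], Y[1] - Y[0])
    V1 = v1 * np.hypot(X[2] - X[3], Y[2] - Y[3])
    # Jacobian matrix of the mapping (Eq. 8), from the bi-linear basis derivatives
    dphi_dxi = np.array([-(1 - eta), 1 - eta, eta, -eta])
    dphi_deta = np.array([-(1 - xi), -xi, xi, 1 - xi])
    dxdxi, dxdeta = dphi_dxi @ X, dphi_deta @ X
    dydxi, dydeta = dphi_dxi @ Y, dphi_deta @ Y
    J = dxdxi * dydeta - dxdeta * dydxi
    # Step 3: relative velocity (Eq. 9)
    dxi_dt = ((1 - xi) * U0 + xi * U1) / J
    deta_dt = ((1 - eta) * V0 + eta * V1) / J
    # Step 4: back to physical velocity (Eq. 10)
    return dxdxi * dxi_dt + dxdeta * deta_dt, dydxi * dxi_dt + dydeta * deta_dt
```

A uniform velocity prescribed through its edge-normal components is recovered exactly at any (ξ, η), the property proven later for this scheme.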

3D field
The three-dimensional interpolation on C-grids is different for z- and s-levels. For z-levels, the horizontal and vertical directions are decoupled: the horizontal velocity is computed as for a 2D field (Eqs. 9 and 10), and the vertical velocity is interpolated linearly:

$w = (1 - \zeta)\, w_0 + \zeta\, w_1.$   (11)
For s-levels, the three velocities must be interpolated at once. The three-dimensional interpolation is similar to its two-dimensional version, but it is not the straightforward extension, which would linearly interpolate the fluxes as in Eq. 9 and divide the result by the Jacobian. Indeed, in 2D the interpolation scheme is built specifically such that a uniform velocity field is exactly interpolated by Eq. 10 (see demonstration in Section 3.1). This is made possible since, when developing the right-hand side of Eq. 10, one obtains a numerator which is a bi-linear function of ξ and η, and a denominator which is the Jacobian, precisely bi-linear in ξ and η. A similar approach in 3D would lead to a tri-linear numerator, but the Jacobian $J_{3D}$ is a tri-quadratic function of the coordinates ξ, η and ζ:

$J_{3D}(\xi, \eta, \zeta) = \det \mathbf{J}_{3D}, \qquad \mathbf{J}_{3D} = \begin{pmatrix} \partial x/\partial\xi & \partial x/\partial\eta & \partial x/\partial\zeta \\ \partial y/\partial\xi & \partial y/\partial\eta & \partial y/\partial\zeta \\ \partial z/\partial\xi & \partial z/\partial\eta & \partial z/\partial\zeta \end{pmatrix}.$   (12)

The interpolation order must then be increased as well, to a quadratic function. Here, as in 2D, the velocities u, v and w are still derived from the coordinate transformation (Step 4):

$\begin{pmatrix} u \\ v \\ w \end{pmatrix} = \mathbf{J}_{3D}\, \begin{pmatrix} d\xi/dt \\ d\eta/dt \\ d\zeta/dt \end{pmatrix},$   (13)

but this time the relative velocities interpolate the fluxes using quadratic Lagrange functions (Step 3):

$\frac{d\xi}{dt} = \frac{1}{J_{3D}} \left[ \varphi_0(\xi)\, U_0 + \varphi_{1/2}(\xi)\, U_{1/2} + \varphi_1(\xi)\, U_1 \right],$   (14)

and similarly for $d\eta/dt$ and $d\zeta/dt$, with the quadratic Lagrange basis $\varphi_0(s) = 2(s - 1/2)(s - 1)$, $\varphi_{1/2}(s) = -4 s (s - 1)$ and $\varphi_1(s) = 2 s (s - 1/2)$.

Step 2 is more complex than in the 2D case. The fluxes interpolated in Eq. 14 are the product of the velocities with the face Jacobian $J_{2D,f_i}$, as illustrated in Fig. 4. The Jacobian $J_{2D,f_i}$ is defined as follows (Weisstein, 2018):

$J_{2D,f_i} = \sqrt{ \sum_{j=1}^{3} M_{j,i}(\mathbf{J}_{3D})^2 },$   (15)

with $M_{j,i}(\mathbf{J}_{3D})$ the (j, i) minor of $\mathbf{J}_{3D}$, i.e. the determinant of the Jacobian matrix $\mathbf{J}_{3D}$ from which row j and column i were removed, leading to a 2x2 matrix. Fluxes through a face that is normal to ξ, η or ζ in the unit cell are computed with the face Jacobian evaluated on that face. It is important to note another difference between the 2D and the 3D approaches here. While in 2D the fluxes were simply the product of the velocity and the edge length, and were independent of ξ and η, this is not the case in 3D anymore: the face Jacobian is a function of the relative coordinates. This is because an interface will not be evenly distorted from the physical to the unit cell.
In fact, the 2D scheme is a particular case of the 3D one, in which the edge length plays the role of the face Jacobian and is independent of the relative coordinates.
Finally, the velocities $u_0$, $u_1$, $v_0$, $v_1$, $w_0$ and $w_1$ are provided by the grid, but this is not the case for $u_{1/2}$, $v_{1/2}$ and $w_{1/2}$. For ocean applications under the Boussinesq approximation, mass conservation reduces to volume conservation (Cushman-Roisin and Beckers, 2011), leading to the continuity equation. Using this equation, the fluxes through the faces [12,13,14,15] (in blue in Fig. 4), [16,17,18,19] (in red) and [8,9,10,11] (in green) can be computed if the mesh is fixed. Otherwise, for example with vertically adaptive grids, the change in volume of the mesh must be considered, which is currently not implemented in Parcels. Ignoring this term in vertically adaptive grids would lead to errors on the order of 1% (Kjellsson and Zanna, 2017).

B-grid
The B-grid is a combination of the A- and the C-grids. It is used by OGCMs such as MOM (Griffies et al., 2004) or POP (Smith et al., 2010).
For two-dimensional fields, the velocity nodes are located on the cell vertices as in an A-grid and the tracer nodes are at the centre of the cells as in a C-grid. The velocity field is thus interpolated exactly as for an A-grid (Eq. 1), and the tracer is like in a C-grid, constant over the cell.
For a three-dimensional cell, the tracer node is still at the centre of the cell and the field is constant; the four horizontal velocity nodes are located at the middle of the four vertical edges; the two vertical velocity nodes are located at the centre of the two horizontal faces (Wubs et al., 2006). The horizontal component of the interpolated velocity in the cell thus only varies as a function of ξ and η, and the vertical component is a function of ζ only:

$u(\xi, \eta) = \sum_{n=0}^{3} \varphi_n^{2D}(\xi, \eta)\, u_n, \quad v(\xi, \eta) = \sum_{n=0}^{3} \varphi_n^{2D}(\xi, \eta)\, v_n, \quad w(\zeta) = (1 - \zeta)\, w_0 + \zeta\, w_1,$

with the local indices $(u_0, u_1, u_2, u_3)$ corresponding to the global indices $(u_{k,j,i}, u_{k,j,i+1}, u_{k,j+1,i+1}, u_{k,j+1,i})$, and similarly for the v field. $w_0$ and $w_1$ correspond to $w_{k,j+1,i+1}$ and $w_{k+1,j+1,i+1}$, and tracer $t_0$ to $t_{k,j+1,i+1}$. In Parcels v2.0, this field interpolation is only available for z-levels.

Parcels relies on a set of Field objects, combined in a FieldSet, to interpolate different quantities at the particle location.
As explained in Section 2.1, a field is discretised on a grid. In Parcels v2.0, four grid objects are defined: RectilinearZGrid, RectilinearSGrid, CurvilinearZGrid and CurvilinearSGrid, whose time dimension may be empty for steady-state fields. Note that models such as Hycom, using hybrid grids combining z- and σ-levels, are treated by Parcels as general RectilinearSGrid or CurvilinearSGrid objects, for which the depth of the s-levels is provided.
A Field has an interp_method attribute, which is set to linear for fields discretised on an A-grid; bgrid_velocity, bgrid_w_velocity and bgrid_tracer for horizontal velocity, vertical velocity and tracer fields on a B-grid; and cgrid_velocity and cgrid_tracer for velocity and tracer fields on a C-grid. Note that a nearest interpolation method is also available, which should not be used for B- and C-grids.
Parcels can load field data from various input formats. The most common approach consists in reading netCDF files using the FieldSet.from_netcdf() method or one of its derivatives. However, Python objects such as xarray or numpy arrays can also be loaded using FieldSet.from_xarray_dataset() or FieldSet.from_data(), respectively.
Loading a long time series of data often requires a significant memory allocation, which is not always available on the computer. The previous Parcels version circumvented the problem by loading the data step by step. Using the deferred_load flag, which is set by default, this process is fully automated in v2.0 and allows the use of long time series: under the hood, the time steps of the data are loaded only when they are strictly necessary for the computation.

Meta-field objects
In Parcels, a variety of other objects make it easier to read fields. In this section, we describe the new objects recently added to the framework.
The first object is the VectorField, which jointly interpolates the two or three components of a vector field such as the velocity.
This object is not only convenient but necessary, since the u and v fields are both required to interpolate the zonal and meridional velocity in the C-grid curvilinear discretisation.
Another useful object is the SummedField, which is a list of fields that are summed to create a new field. The fields of a SummedField do not necessarily share the same grid. For example, this object can be used to create a velocity which is the sum of the surface water current and the Stokes drift. This object has no other purpose than greatly simplifying the kernels defining the particle dynamics.

The fields do not necessarily have to cover the entire region of interest. If a field is interpolated outside its boundary, an ErrorOutOfBounds is raised, which leads to particle deletion unless this error is processed through an appropriate kernel. A sequence of fields covering different regions, which may or may not overlap, can be interpolated under the hood by Parcels with a NestedField. In this case, the fields composing the NestedField must represent the same physical quantity. The NestedField fields are ranked to set the priority order in which they are interpolated.
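The priority-ordered lookup behind a NestedField can be pictured with a few lines of plain Python. This is a conceptual sketch only; the class and function names below are ours, not the Parcels API:

```python
class OutOfBounds(Exception):
    """Mimics Parcels' ErrorOutOfBounds: no field covers the requested point."""

def nested_sample(fields, x, y):
    """Sample the first field, in priority order, whose domain contains (x, y).

    `fields` is a list of (in_bounds, sample) pairs of callables."""
    for in_bounds, sample in fields:
        if in_bounds(x, y):
            return sample(x, y)
    raise OutOfBounds((x, y))

# A high-resolution regional field takes priority over a global fallback
regional = (lambda x, y: 0 <= x <= 10 and 45 <= y <= 55, lambda x, y: "regional")
global_f = (lambda x, y: -180 <= x <= 180 and -90 <= y <= 90, lambda x, y: "global")
fields = [regional, global_f]
```

A point inside the regional box is served by the regional field, a point outside it by the global one, and a point outside every domain raises the out-of-bounds error, just as described above.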

Available data are not always provided in the expected units. The most frequent example is the velocity given in m s⁻¹ while the particle position is in degrees. The same problem occurs with diffusivities in m² s⁻¹. UnitConverter objects automatically convert the units of the data. The two examples mentioned above are defined in Parcels, and other UnitConverter objects can be implemented by the user for other transformations.
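A minimal sketch of such a conversion, assuming a spherical Earth with one degree of latitude spanning 60 nautical miles of 1852 m each; this mirrors the idea, not the actual Parcels UnitConverter classes:

```python
import math

M_PER_DEG = 1852.0 * 60.0  # metres per degree of latitude (~111.12 km)

def velocity_ms_to_degs(u_ms, v_ms, lat_deg):
    """Convert a velocity from m/s to degrees/s at a given latitude.

    The zonal component is additionally scaled by 1/cos(latitude), since
    meridians converge towards the poles."""
    u_degs = u_ms / (M_PER_DEG * math.cos(math.radians(lat_deg)))
    v_degs = v_ms / M_PER_DEG
    return u_degs, v_degs
```

The same pattern (a scale factor possibly depending on position) covers the diffusivity case, with the factor squared.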

The kernels define the particle dynamics (Lange and van Sebille, 2017). Various built-in kernels are already available in Parcels. AdvectionRK4, AdvectionRK45 and AdvectionEE implement the Runge-Kutta 4, Runge-Kutta-Fehlberg and explicit Euler integration schemes for advection. While other explicit discrete time schemes can be defined, analytical integration schemes (Blanke and Raynaud, 1997; Chu and Fan, 2014) are not yet available in Parcels.
BrownianMotion2D and SpatiallyVaryingBrownianMotion2D implement different types of Brownian motion available as kernels. Custom kernels can be defined by the user for application-dependent dynamics.
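A Parcels kernel is written as a plain Python function that updates particle attributes in place. The sketch below imitates that style with stand-in objects so it runs without Parcels; the kernel name and the dummy field set are ours, not built-ins:

```python
from types import SimpleNamespace

def AgeingEulerKernel(particle, fieldset, dt):
    """Explicit-Euler advection plus an age update, written in kernel style."""
    u, v = fieldset.UV(particle.lon, particle.lat)
    particle.lon += u * dt
    particle.lat += v * dt
    particle.age += dt

# Dummy field set: a uniform eastward current of 0.1 (arbitrary units)
fieldset = SimpleNamespace(UV=lambda lon, lat: (0.1, 0.0))
particle = SimpleNamespace(lon=0.0, lat=50.0, age=0.0)
AgeingEulerKernel(particle, fieldset, dt=60.0)
```

In Parcels itself such a function would be passed to the particle set's execution method and translated to C, which is why the kernel syntax is kept deliberately simple.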
The kernels are implemented in Python but are executed in C for efficiency (Lange and van Sebille, 2017), even if a full Python mode is also available. However, the automated translation of the kernels from Python to C somewhat limits the freedom in the syntax of the kernels. For advanced kernels, the possibility to call a user-defined C library is available in version 2.0.

Uniform velocity on a 2D C-grid
In this section, we prove that the C-grid interpolation preserves exactly a uniform velocity in a quadrilateral. To do so, let us define a uniform velocity $\mathbf{u} = (u, v)$ and a quadrilateral with x coordinates $[X_0, X_1, X_2, X_3]$ and y coordinates $[Y_0, Y_1, Y_2, Y_3]$. On such an element, the velocities $u_0$, $u_1$, $v_0$, $v_1$ are the scalar product of $\mathbf{u}$ and $\mathbf{n}$, the unit vector normal to the edge, and the fluxes $U_0$, $U_1$, $V_0$, $V_1$ are the velocities multiplied by the edge lengths, leading to:

$U_0 = u\,(Y_3 - Y_0) - v\,(X_3 - X_0), \qquad U_1 = u\,(Y_2 - Y_1) - v\,(X_2 - X_1),$
$V_0 = -u\,(Y_1 - Y_0) + v\,(X_1 - X_0), \qquad V_1 = -u\,(Y_2 - Y_3) + v\,(X_2 - X_3).$

Therefore, developing Eq. 9 results in:

$\frac{d\xi}{dt} = \frac{1}{J_{2D}} \left( u\, \frac{\partial y}{\partial\eta} - v\, \frac{\partial x}{\partial\eta} \right), \qquad \frac{d\eta}{dt} = \frac{1}{J_{2D}} \left( -u\, \frac{\partial y}{\partial\xi} + v\, \frac{\partial x}{\partial\xi} \right),$

and then applying Eq. 10:

$\begin{pmatrix} u \\ v \end{pmatrix} = \frac{1}{J_{2D}} \begin{pmatrix} \partial x/\partial\xi & \partial x/\partial\eta \\ \partial y/\partial\xi & \partial y/\partial\eta \end{pmatrix} \begin{pmatrix} u\, \partial y/\partial\eta - v\, \partial x/\partial\eta \\ -u\, \partial y/\partial\xi + v\, \partial x/\partial\xi \end{pmatrix} = \begin{pmatrix} u \\ v \end{pmatrix},$

independently of ξ and η. For the 3D case, the same result is obtained numerically. It can be evaluated using the simple Python C-grid interpolator code available at https://doi.org/10.5281/zenodo.3253697.

z-and s-level C-grid compatibility
As mentioned above, the horizontal and vertical directions in grids using z-levels are completely decoupled, such that the horizontal velocity can be computed as for a 2D field and the vertical interpolation is computed linearly. But a z-level grid is a particular case of an s-level grid. We show that the 3D C-grid interpolator reduces to the simpler z-level C-grid scheme when $Z_0 = Z_1 = Z_2 = Z_3$ and $Z_4 = Z_5 = Z_6 = Z_7$.
First, for z-levels, it is noteworthy that:

$\sum_n \frac{\partial \varphi_n^{2D}}{\partial\xi}\, X_n = \frac{\partial x}{\partial\xi},$

and similarly for $\partial x/\partial\eta$, $\partial y/\partial\xi$ and $\partial y/\partial\eta$. $\mathbf{J}_{3D}$ is:

$\mathbf{J}_{3D} = \begin{pmatrix} \partial x/\partial\xi & \partial x/\partial\eta & 0 \\ \partial y/\partial\xi & \partial y/\partial\eta & 0 \\ 0 & 0 & Z_4 - Z_0 \end{pmatrix},$

and $J_{3D} = J_{2D}\,(Z_4 - Z_0)$. All the fluxes through the vertical faces reduce to the product of the velocity, the horizontal edge length and the element height, as for $U_0$:

$U_0 = u_0\, L_{03}\,(Z_4 - Z_0).$

The fluxes through the horizontal faces are:

$W_0 = w_0\, J_{2D}(\xi, \eta), \qquad W_1 = w_1\, J_{2D}(\xi, \eta).$

Therefore, the inner fluxes result in:

$U_{1/2} = \frac{U_0 + U_1}{2},$   (19)

and similarly for $V_{1/2}$ and $W_{1/2}$. The first two lines of Eq. 13 reduce to Eq. 10, and the third line is:

$w = (Z_4 - Z_0)\, \frac{d\zeta}{dt}.$   (20)

Finally, using Eq. 19, Eq. 14 becomes:

$\frac{d\xi}{dt} = \frac{(1 - \xi)\, U_0 + \xi\, U_1}{J_{3D}}, \quad \frac{d\eta}{dt} = \frac{(1 - \eta)\, V_0 + \eta\, V_1}{J_{3D}}, \quad \frac{d\zeta}{dt} = \frac{(1 - \zeta)\, W_0 + \zeta\, W_1}{J_{3D}},$

of which the first two lines correspond to Eq. 9 and the third, combined with Eq. 20, reduces to Eq. 11.
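The key step above is that the quadratic flux interpolation of Eq. 14 collapses to the linear one of Eq. 9 whenever the inner flux is the mean of the two outer fluxes. This is easy to check numerically; the sketch assumes the quadratic Lagrange basis on nodes 0, 1/2 and 1:

```python
def quad_basis(s):
    """Quadratic Lagrange basis functions on nodes 0, 1/2 and 1."""
    return (2 * (s - 0.5) * (s - 1),  # equals 1 at s = 0
            -4 * s * (s - 1),         # equals 1 at s = 1/2
            2 * s * (s - 0.5))        # equals 1 at s = 1

def quad_flux(F0, Fh, F1, s):
    """Quadratic interpolation of three face fluxes along one direction."""
    b0, bh, b1 = quad_basis(s)
    return b0 * F0 + bh * Fh + b1 * F1
```

With Fh = (F0 + F1)/2 this equals the linear interpolation (1 − s) F0 + s F1 for every s, which is exactly why the s-level scheme reduces to the z-level one.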
4 Simulating the sensitivity of North West European continental shelf floating microplastic distribution

Microplastic (MP) is transported through all marine environments and has been observed in large quantities both at coastlines (e.g. Browne et al., 2011) and in open seas (e.g. Barnes et al., 2009), at the surface and on the sea bed. It represents potential risks to the marine ecosystem that cannot be ignored (Law, 2017). At a global scale, high concentrations are reported in the subtropics (Law et al., 2010) but also in the Arctic (Obbard et al., 2014). Various studies have already modelled the accumulation of MP in the Arctic (van Sebille et al., 2012, 2015; Cózar et al., 2017), highlighting the MP transport from the North Atlantic and the North Sea. Meanwhile, at smaller scales, other studies have focused on marine litter in the Southern part of the North Sea (Neumann et al., 2014; Gutow et al., 2018; Stanev et al., 2019) and have included diffusion and wind drift in their models, as well as used a higher resolution.
Here we study how the modelled accumulation of floating MP in the Arctic depends on the incorporation of physical processes and on the model resolutions used for the Southern part of the North Sea. Parcels is used to evaluate the sensitivity of the floating MP distribution under those constraints. To do so, virtual floating MP particles are released off the Rhine and Thames estuaries and tracked for three years. The floating MP distribution is then compared with the trajectories of passive 3D particles, which are not restricted to stay at the sea surface. Note that this section is not meant as a comprehensive study of MP transport off the North Sea, but rather as an application of the new features implemented in Parcels, in both two and three dimensions.

Input data
We study the influence of the different physical processes impacting surface currents, such as density- and wind-driven currents, tidal residual currents and Stokes drift, but also the impact of mesh resolution and diffusion. The data come from various data sets (Fig. 5), described in this section. Links to access the data are provided in the code and data availability section.

NEMO
The main data we use are ORCA0083-N006 and ORCA025-N006, which are standard set-ups of NEMO (Madec et al., 2016), an ocean circulation model forced by reanalysis and observed flux data: the Drakkar Forcing Set (Dussin et al., 2016).
The forcings consist of wind, heat and freshwater fluxes at the surface.
The data are available globally at resolutions of 1/4° (ORCA025-N006) and 1/12° (ORCA0083-N006). They are discretised on an ORCA grid (Madec and Imbard, 1996), a global ocean tripolar grid, which is curvilinear. The mesh is composed of 75 z-levels and the variables are positioned following a C-grid. The temporal resolution is 5 days.

North West shelf reanalysis
The North West shelf reanalysis (Mahdon et al., 2017) is an ocean circulation flow data set based on the Forecasting Ocean Assimilation Model 7 km Atlantic Margin Model, which couples NEMO for the ocean with the European Regional Seas Ecosystem Model (Blackford et al., 2004). The reanalysis contains tidal residual currents. The data are freely available on the Copernicus Marine Environment Monitoring Service (CMEMS). They have a resolution of about 7 km (1/9° lon x 1/15° lat), from (40° N, 20° W) to (65° N, 13° E), with a temporal resolution of 1 day. The data are originally computed on a C-grid, but are re-interpolated on the tracer nodes to form an A-grid data set with z-levels, which is available on CMEMS.
The data, which will be referred to as NWS, do not cover the entire modelling region, such that a NestedField is used to interpolate them within the available region (green zone in Fig. 5) and use the NEMO data for particles outside that region.

WaveWatch III
Stokes drift, i.e. the surface residual current due to waves, was obtained from WaveWatch III (Tolman et al., 2009), which was run using wind forcings from the NCEP Climate Forecast System Reanalysis (CFSR, Saha et al., 2010). The data have a spatial resolution of 1/2°, extending up to 80° N, and a temporal resolution of 3 hours.

Every day of year 2000, 100 particles are released in the mouth of the Thames estuary and 100 more particles in the mouth of the Rhine, before being tracked for three years. For the 2D runs, (a) to (e), the particles are released at the sea surface and follow the horizontal surface currents. For the 3D run (f), the particles are released at 0.1, 0.5 and 1 m depths and follow the 3D NEMO flow field.

Simulations
The diffusion, which parametrises the unresolved processes, is modelled as a stochastic zeroth-order Markov model (van Sebille et al., 2018). The diffusion parameter is proportional to the mesh size, exactly as in the study of North Sea marine litter by Neumann et al. (2014):

$D = D_0\, \frac{l}{l_0},$

with $D_0 = 1$ m² s⁻¹ the reference diffusivity, l the square root of the mesh cell area and $l_0 = 1$ km. This formulation leads to a diffusivity of the same order of magnitude as the constant value of Gutow et al. (2018).

The beaching of MP is non-negligible in the North Sea (Gutow et al., 2018), even if it is still poorly understood and often either ignored (Neumann et al., 2014) or resulting from the low-resolution current and wind conditions (Gutow et al., 2018). Here we distinguish two flow types: the first, based on the NEMO and NWS data, has impermeable boundary conditions at the coast; the second, which includes Stokes drift and diffusion, allows beaching.
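The mesh-size-dependent diffusivity and its zeroth-order random walk can be sketched as follows; the names are illustrative, and the step size $\sqrt{2 D \Delta t}$ follows the standard Brownian-motion discretisation described in van Sebille et al. (2018):

```python
import math
import random

D0 = 1.0      # reference diffusivity, m^2/s
L0 = 1000.0   # reference length scale, m (1 km)

def diffusivity(cell_size_m):
    """Diffusivity proportional to the local mesh size l = sqrt(cell area)."""
    return D0 * cell_size_m / L0

def random_walk_step(x_m, y_m, cell_size_m, dt_s, rng):
    """One zeroth-order Markov (Brownian) displacement, in metres."""
    sigma = math.sqrt(2.0 * diffusivity(cell_size_m) * dt_s)
    return x_m + rng.gauss(0.0, sigma), y_m + rng.gauss(0.0, sigma)
```

With this scaling, a 7 km mesh such as NWS gets a diffusivity of 7 m² s⁻¹, and the ensemble variance of the displacements grows as 2 D Δt, as expected for Brownian motion.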
For numerical reasons, due to the integration time step of 15 minutes and the Runge-Kutta 4 scheme, it is theoretically possible that particles beach even with the NEMO or NWS data. This could happen for example in a region of coastal downwelling, since the particles are forced to stay at the surface and could be constantly transported towards the coast. The particle dynamics is thus implemented using separate kernels. At each time step, the particle position is first updated following the NEMO or NWS advection. The particle is then checked to still be located in a wet cell; otherwise it is pushed back to sea using an artificial current. In a second step, the Stokes and diffusion kernels are run; if the particle beaches there, it stops moving. In a final step, the particle age is updated. The kernel code as well as all the scripts running and post-processing the simulations are available at https://doi.org/10.5281/zenodo.3253693.
To compare the simulations, the Parcels raw results, consisting of particle position, age and beaching status exported every two days, are post-processed into the following maps and budgets.
The particle density (Fig. 6) is computed as the number of particles per square kilometre, averaged over the third year of 25 particle age. Note that the absolute value of the concentration is not particularly meaningful, since it is simply proportional to the number of particles released.
To analyse the particle paths, the ocean is discretised into cells of 1/4° longitude x 1/8° latitude resolution, and for each cell the fraction of particles that have visited the cell at least once is computed (Fig. 7).
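The visited-cell fraction can be computed with a simple binning pass over the stored trajectories. A sketch assuming trajectory arrays of shape (n_particles, n_obs); the function name is ours:

```python
import numpy as np

def visited_fraction(lons, lats, lon_edges, lat_edges):
    """Fraction of particles having visited each cell at least once.

    lons, lats: trajectory arrays of shape (n_particles, n_obs)."""
    n_particles = lons.shape[0]
    count = np.zeros((len(lat_edges) - 1, len(lon_edges) - 1))
    for p in range(n_particles):
        h, _, _ = np.histogram2d(lats[p], lons[p], bins=[lat_edges, lon_edges])
        count += h > 0  # a particle counts once per cell, however often it returns
    return count / n_particles
```

The thresholding `h > 0` is what distinguishes this "visited at least once" map from a plain density map.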
To study the temporal dynamics of the particles, the region is divided into six zones (Fig. 5): North Sea, Skagerrak and Kattegat, Baltic Sea, Norwegian Coast, Arctic Ocean and Atlantic Ocean, and the evolution of the distribution of the particles over those zones is computed (Fig. 8). The time axis represents the particle age in years.
Finally, the integrated vertical distribution (Fig. 9) of the particles as a function of the latitude is computed for the 3D run.
For this profile, the domain is divided into bins of 0.5° latitude by 5 m depth and, every two days, each particle is mapped to the cell it belongs to, leading to the integrated vertical distribution. Note the linear scale for the upper 50 m depth, and the logarithmic scale for the full vertical profile.
Animations showing the particle dynamics are available in the article supplementary materials.

Results
The results show various minor and major differences between the scenarios.

While NEMO 1/12° and NEMO 1/4° show similar dynamics for the first year (Fig. 8), the Norwegian fjords have a stronger trapping role in the 1/4° resolution run, even though the plastic does not beach in either run. This increased particle trapping could be a consequence of the lower resolution of the data, which results in a reduced horizontal velocity shear, while a strong shear layer behaves as a barrier isolating the coast from the open waters (Delandmeter et al., 2017). This is also observed in the supplementary materials animations, in which the main particle path is observed further from the coast in the NEMO 1/12° run. As a consequence of the trapping, the amount of MP reaching the Arctic is reduced in NEMO 1/4°. This run also produces significantly lower densities north of 80° N. Although neither of the runs resolves coastal dynamics, because of the low temporal and spatial resolution and the lack of tides, they show important differences. Since no validation was performed for those MP simulations, there is no reason to argue that the 1/12° resolution is high enough to simulate MP dynamics, but this resolution is similar to that of other studies of plastic litter in the region (Gutow et al., 2018).

The main differences when using NWS result from the dynamics during the first year, when the particles are located south of 65° N, explaining the lack of differences in the Arctic region between Fig. 6a and Fig. 6c and between Fig. 7a and Fig. 7c. The residence time in the North Sea is increased relative to NEMO 1/12°, and different peak events occur in the Skagerrak and Kattegat, resulting in final concentrations of 68% in the Arctic and 27% on the Norwegian coast, respectively 7% higher and 8% lower than with NEMO 1/12° (Fig. 8).

Including Stokes drift has a major impact on the MP dynamics in the North West European continental shelf, due to the prevailing westerly winds (Gutow et al., 2018), with close to 90% of the plastic staying in the North Sea, 9% beaching on the Norwegian coast and less than 0.25% reaching the Arctic. Particles beach very quickly: 90% in less than 3.5 months, and 99% within 10 months. While those numbers are not validated here, we can still point out that even if Stokes drift has an important contribution to surface dynamics at large scales (Onink et al., 2019), using it at smaller scales needs a proper validation. In particular, the boundary condition should be treated with care: it has a large impact in this application, where the particles are released next to the coast.
The parametrisation of sub-grid scales and diffusion is still an active field of research in the Lagrangian community, but it is generally agreed that it cannot be neglected. In this application, we observe how adding diffusion impacts the fate of MP. The amount of MP reaching the Arctic is reduced by 68% compared to NEMO 1/12°, with large accumulation in the North Sea and on the Norwegian coast, but not in the Skagerrak and Kattegat. Overall, the proportion of beached particles increases linearly to 73% during the first year, before slowly reaching 83% during the next two years.
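A common way to parametrise such sub-grid scale diffusion in Lagrangian models is a zero-order Markov (Brownian) random walk. The sketch below is generic, not the exact kernel used in the simulations, and the diffusivity value is illustrative only:

```python
import math
import random

def random_walk_step(x, y, Kh, dt, rng):
    """One step of a Brownian random walk parametrising horizontal diffusion:
    each coordinate is perturbed by sqrt(2 * Kh * dt) * N(0, 1)."""
    sigma = math.sqrt(2.0 * Kh * dt)
    return x + rng.gauss(0.0, 1.0) * sigma, y + rng.gauss(0.0, 1.0) * sigma

rng = random.Random(0)            # seeded for reproducibility
Kh, dt = 10.0, 600.0              # illustrative diffusivity (m^2/s) and time step (s)
positions = [(0.0, 0.0)] * 2000   # particle ensemble released at the origin
for _ in range(100):
    positions = [random_walk_step(x, y, Kh, dt, rng) for x, y in positions]

# The ensemble variance in each direction should approach 2 * Kh * t
t = 100 * dt
var_x = sum(x * x for x, _ in positions) / len(positions)   # expected ~ 1.2e6 m^2
```

The consistency check at the end (variance growing as 2 Kh t) is the standard way to verify that a random-walk kernel reproduces the intended diffusivity.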
Maintaining the MP at the surface is a strong assumption: biofouling, degradation and hydrodynamics affect the plastic depth, which impacts its lateral displacement. In the 3D particle run (Fig. 6f), we do not include all the processes driving MP vertical dynamics, whose parametrisations are currently being developed by the community, but simply model the path of passive particles following the three-dimensional current. Passive particles do not accumulate along the coast like the floating MP (Fig. 6f). This results in an increased transport towards the Arctic (Figs. 7f and 8f). There are overall higher concentrations in the Arctic, including North of 80° N and in the Eastern part of the Kara Sea. Around 1% of the passive particles end up in the Baltic Sea, crossing the Skagerrak and Kattegat using favourable deeper currents (Gustafsson, 1997), while such transport was negligible for floating MP. The vertical distribution of the particles is also analysed (Fig. 9): the majority of the particles, originally released in shallow waters, are still observed close to the surface, with 21%, 63% and 97% of the particle records above 10 m, 100 m and 500 m depth, respectively. The vertical concentration gradient decreases as the latitude increases, with almost no gradient in the Polar region. Passive particles follow a significantly different path than surface ones, highlighting the importance of better understanding the vertical dynamics of the plastic to improve the accuracy of its distribution modelling.

This brief study of the sensitivity of the North Sea floating MP distribution illustrates how Parcels is used to gather and compare flow fields from a multitude of data sets in both two and three dimensions, which was made possible by the development of the different field interpolation schemes and meta-field objects. To validate the MP dynamics observed, it is essential to couple such a numerical study with an extensive field study.
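Cumulative depth statistics of this kind can be computed directly from the particle depth records. The sketch below uses invented example records (the actual analysis uses the output files of the 3D run):

```python
def fraction_above(depths, levels):
    """Return, for each reference level (m), the fraction of depth records
    shallower than that level. Depths are positive downwards, in metres."""
    n = len(depths)
    return {z: sum(1 for d in depths if d < z) / n for z in levels}

# Invented example depth records, for illustration only
records = [2.0, 5.0, 8.0, 50.0, 120.0, 300.0, 800.0, 1.0, 90.0, 40.0]
fracs = fraction_above(records, [10.0, 100.0, 500.0])
```

Applied to the full set of particle records of the 3D run, this yields the 21%, 63% and 97% fractions quoted above.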
Parcels, a Lagrangian ocean analysis framework, has been considerably improved since version 0.9, and can now read data from multiple fields discretised on different grids and grid types. In particular, a new interpolation scheme for curvilinear C-grids was developed and implemented into Parcels v2.0. This article described this new interpolation scheme as well as the other schemes available in Parcels, covering A-, B- and C-staggered grids, rectilinear and curvilinear horizontal meshes, and z- and s-vertical levels. Numerous features were implemented, including the meta-field objects described here.
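On a rectilinear C-grid cell, the low-order velocity interpolation reduces to a linear variation of each velocity component between the two cell faces on which it is defined. A minimal sketch, using relative cell coordinates in [0, 1] (the curvilinear case additionally involves the cell geometry and is not shown here):

```python
def cgrid_velocity(xsi, eta, u_w, u_e, v_s, v_n):
    """Low-order C-grid interpolation within a rectilinear cell: the zonal
    velocity varies linearly between the west and east face values (and is
    constant in the meridional direction), and vice versa for the meridional
    velocity. xsi and eta are relative coordinates in the cell, in [0, 1]."""
    u = (1.0 - xsi) * u_w + xsi * u_e
    v = (1.0 - eta) * v_s + eta * v_n
    return u, v

# Evaluate the velocity a quarter of the way across the cell zonally,
# halfway meridionally, from the four face-normal velocities
u, v = cgrid_velocity(0.25, 0.5, u_w=1.0, u_e=2.0, v_s=0.0, v_n=1.0)
```

This piecewise-linear reconstruction keeps the interpolated velocity consistent with the face-normal fluxes of the discretised field, which is the key property motivating a dedicated C-grid scheme rather than naive A-grid-style interpolation of staggered data.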
Parcels v2.0 was used to simulate the dynamics of floating microplastic on the North West European continental shelf, virtually released during one year off the Thames and Rhine estuaries before drifting towards the Arctic, and to assess the sensitivity of this transport to various physical processes and numerical choices, such as mesh resolution and diffusion parametrisation. While those simulations do not provide a comprehensive study of microplastic dynamics in the area, they highlight key points to consider and illustrate the interest of using Parcels for such modelling.
The next step in Parcels development will involve increasing the model efficiency and developing a fully parallel version of the Lagrangian framework.
Code and data availability.
-Parcels code: The code for Parcels is licensed under the MIT licence and is available through GitHub at https://www.github.com/OceanParcels/parcels. The version 2.0 described here is archived at Zenodo at https://doi.org/10.5281/zenodo.3257432. More information is available on the project webpage at http://www.oceanparcels.org.
-Interpolation code: Independently from Parcels, a simple Python code also implements all the C-grid interpolation schemes developed in this paper. It is available at https://doi.org/10.5281/zenodo.3253697.
-North Sea floating MP simulations: All the scripts running and post-processing the North Sea MP simulations are available at https://doi.org/10.5281/zenodo.3253693.
-NEMO data: The NEMO N006 data are kindly provided by Andrew Coward at NOC Southampton, UK, and can be downloaded at http://opendap4gws.jasmin.ac.uk/thredds/nemo/root/catalog.html.