
Atmospheric-Dynamics-GUF/PinCFlow.jl

PinCFlow.jl: An idealized-atmospheric-flow solver coupled to the 3D transient gravity-wave model MS-GWaM


Introduction

PinCFlow.jl (Pseudo-inCompressible Flow solver) is an atmospheric-flow solver designed for idealized simulations. It integrates the Boussinesq, pseudo-incompressible and compressible equations in conservative flux form (Klein, 2009; Rieper et al., 2013), using a semi-implicit method that combines explicit and implicit time-stepping schemes (Benacchio & Klein, 2019; Schmid et al., 2021; Chew et al., 2022). Spatially, the equations are discretized with a finite-volume method, so that all quantities are represented by averages over grid cells and fluxes are computed on the respective cell interfaces. The grid is staggered (Arakawa & Lamb, 1977) so that the velocity components are defined at the same points as the corresponding fluxes of scalar quantities. PinCFlow.jl operates in a vertically stretched terrain-following coordinate system based on Gal-Chen and Somerville (1975a), Gal-Chen and Somerville (1975b) and Clark (1977).

The Lagrangian gravity-wave parameterization MS-GWaM (Multi-Scale Gravity-Wave Model) is interactively coupled to the dynamical core of PinCFlow.jl, so that unresolved gravity waves may be parameterized in a manner that accounts for transient wave-mean-flow interaction and horizontal wave propagation. The resolved fields are updated with tendencies computed by MS-GWaM at the beginning of every time step. A description of the theory behind MS-GWaM can be found in Achatz et al. (2017) and Achatz et al. (2023). For a numerical perspective and more information on the development, see Muraschko et al. (2014), Boeloeni et al. (2016), Wilhelm et al. (2018), Wei et al. (2019) and Jochum et al. (2025).

User guide

Installation

To install PinCFlow.jl, first make sure you have installed Julia. You can then run

using Pkg

Pkg.add("PinCFlow")

in the Julia REPL to add PinCFlow.jl to your current project environment.

Running the model

As a minimal example,

using PinCFlow

integrate(Namelists())

runs PinCFlow.jl in its default configuration. This simulation will finish almost instantly and won't produce particularly interesting results, since PinCFlow.jl simply initializes a $1 \times 1 \times 1 \ \mathrm{km^3}$ isothermal atmosphere at rest with a single grid cell and integrates the pseudo-incompressible equations over one hour. A more complex configuration can be set up by constructing namelists with changed parameters. This is illustrated in PinCFlow.jl's example functions, which are defined in its Examples module. To use the functions, we recommend downloading the examples folder from the repository and running

Pkg.activate("examples")
Pkg.instantiate()

to install the project's dependencies. After that, you can run any of the example simulations without worrying about extra packages. For instance, calling the function

# src/Examples/periodic_hill.jl

function periodic_hill(;
    x_size::Integer = 40,
    z_size::Integer = 40,
    npx::Integer = 1,
    npz::Integer = 1,
    output_file::AbstractString = "periodic_hill.h5",
    prepare_restart::Bool = false,
    visualize::Bool = true,
    plot_file::AbstractString = "examples/results/periodic_hill.svg",
)
    h0 = 500.0
    l0 = 10000.0

    lz = 20000.0
    zr = 10000.0

    atmosphere = AtmosphereNamelist(;
        model = :Boussinesq,
        background = :StableStratification,
        coriolis_frequency = 0.0,
        initial_u = (x, y, z) -> 10.0,
    )

    domain = DomainNamelist(; x_size, z_size, lx = 20000.0, lz, npx, npz)

    grid = GridNamelist(;
        resolved_topography = (x, y) -> h0 / 2 * (1 + cos(pi / l0 * x)),
    )

    output =
        OutputNamelist(; output_file, output_variables = [:w], prepare_restart)

    sponge = SpongeNamelist(;
        rhs_sponge = (x, y, z, t, dt) ->
            z >= zr ? sin(pi / 2 * (z - zr) / (lz - zr))^2 / dt : 0.0,
    )

    integrate(Namelists(; atmosphere, domain, grid, output, sponge))

    if visualize && MPI.Comm_rank(MPI.COMM_WORLD) == 0
        h5open(output_file) do data
            plot_output(plot_file, data, ("w", 1, 1, 1, 2))
            return
        end
    end

    return
end

with

using PinCFlow, CairoMakie

periodic_hill()

performs a 2D simulation with an initial wind of $10 \ \mathrm{m \ s^{- 1}}$ that generates a mountain wave above a periodic hill, and visualizes the results.
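The pattern above generalizes: a configuration is assembled by constructing only the namelists whose defaults you want to override and passing them to Namelists. A minimal sketch, reusing the keywords from the periodic_hill example (the parameter values here are illustrative, not recommended settings):

```julia
using PinCFlow

# Override only the domain namelist; all other namelists keep their defaults.
# x_size and z_size set the number of grid cells, lx and lz the domain extent
# in meters, following the keywords used in the periodic_hill example above.
domain = DomainNamelist(; x_size = 64, z_size = 64, lx = 20000.0, lz = 20000.0)

integrate(Namelists(; domain))
```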

PinCFlow.jl uses parallel HDF5 to write simulation data. By default, the path to the output file is pincflow_output.h5. This may be changed by setting the parameter output_file of the output namelist accordingly (as illustrated above). The dimensions of most output fields are (in order) $\hat{x}$ (zonal axis), $\hat{y}$ (meridional axis), $\hat{z}$ (axis orthogonal to the vertical coordinate surfaces) and $t$ (time). Ray-volume-property fields differ slightly in that they have an additional dimension in front and a vertical dimension that includes the first ghost layer below the surface. To specify which fields are to be written, set the parameters output_variables, save_ray_volumes and prepare_restart of the output namelist accordingly. A description of all namelists and their parameters is provided in the "Reference" section of the documentation.
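Since the output is a standard HDF5 file, it can also be inspected directly with HDF5.jl. A minimal sketch, assuming a run that wrote a dataset named "w" as in the example above (the actual dataset names depend on the chosen output_variables):

```julia
using HDF5

# Open the default output file and read the vertical wind. With the
# dimension ordering described above, w is indexed as (x, y, z, t).
h5open("pincflow_output.h5") do data
    w = read(data, "w")
    println(size(w))
end
```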

For the visualization of simulation results, we recommend using Makie.jl with the CairoMakie backend. PinCFlow.jl has an extension which exports a few convenience functions if CairoMakie is loaded. This is utilized in the above function, yielding a plot of the vertical wind at the end of the simulation (see below).
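Loading CairoMakie activates this extension, after which the convenience functions can also be applied to a finished run. A sketch, reusing the plot_output call from the example above (the trailing tuple selects the variable and slice exactly as in periodic_hill; its precise semantics are documented in the "Reference" section):

```julia
using PinCFlow, CairoMakie, HDF5

# Plot the vertical wind from an existing output file; the arguments
# follow the plot_output call in the periodic_hill example above.
h5open("periodic_hill.h5") do data
    plot_output("periodic_hill.svg", data, ("w", 1, 1, 1, 2))
end
```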

If you want to run PinCFlow.jl in parallel, make sure you are using the correct backends for MPI.jl and HDF5.jl. By default, the two packages use automatically installed JLL backends. If you keep this setting, you only need to launch with the matching MPI binary (specifically not that of a default MPI installation on your system). For example, by executing

mpiexec=$(julia --project=examples -e 'using MPICH_jll; print(MPICH_jll.mpiexec_path)')
${mpiexec} -n 9 julia --project=examples -e 'using PinCFlow, CairoMakie; periodic_hill(; npx = 3, npz = 3)'

in your shell, you can run the above simulation on 9 MPI processes. Note that npx and npz configure the number of MPI subdomains in $\hat{x}$ and $\hat{z}$, respectively; npx * npz must therefore equal the number of processes, otherwise PinCFlow.jl will throw an error.

If you plan to run PinCFlow.jl on a cluster, you may want to consider using a provided MPI installation as backend. In that case, the MPI preferences need to be updated accordingly and the HDF5 backend has to be set to a library that has been installed with parallel support, using the chosen MPI installation. This can be done by running

julia --project=examples -e 'using MPIPreferences; MPIPreferences.use_system_binary(; library_names = ["/path/to/mpi/library/"])'
julia --project=examples -e 'using HDF5; HDF5.API.set_libraries!("/path/to/libhdf5.so", "/path/to/libhdf5_hl.so")'

with the paths set appropriately (more details can be found in the documentation of MPI.jl and HDF5.jl). Note that this configuration will be saved in examples/LocalPreferences.toml, so that the new backends will be used by all future scripts run in this project. By running

julia --project=examples -e 'using MPIPreferences; MPIPreferences.use_jll_binary()'
julia --project=examples -e 'using HDF5; HDF5.API.set_libraries!()'

you can restore the default backends. Having configured MPI.jl and HDF5.jl to use installations on your system, you can run

mpiexec -n 9 julia --project=examples -e 'using PinCFlow, CairoMakie; periodic_hill(; npx = 3, npz = 3)'

with mpiexec being your chosen system binary. For users who would like to run PinCFlow.jl on Levante, shell-script examples are provided in the folder examples/levante of the repository.

List of publications

  1. Initial flow solver: Rieper et al. (2013)

  2. Initial gravity-wave scheme: Muraschko et al. (2014)

  3. Gravity-wave breaking scheme: Boeloeni et al. (2016)

  4. Gravity-wave theory: Achatz et al. (2017)

  5. Coupling of the flow solver and gravity-wave scheme: Wilhelm et al. (2018)

  6. Horizontal propagation and direct approach in the gravity-wave scheme: Wei et al. (2019)

  7. Semi-implicit time scheme: Schmid et al. (2021)

  8. Extended gravity-wave theory: Achatz et al. (2023)

  9. Terrain-following coordinates & orographic source: Jochum et al. (2025)
