StagBL: A Scalable, Portable, High-Performance Discretization and Solver Layer for Geodynamic Simulation

PI: Paul J. Tackley (ETH Zurich)

Co-PIs: Taras Gerya (ETH Zurich), Boris Kaus (JGU Mainz), Dave A. May (University of Oxford)

July 1, 2017 - June 30, 2020

Project Summary

We propose StagBL, a common solver and discretization library for geodynamic simulation. By working with three leading application codes (StagYY, I3ELVIS, and LaMEM), the library will provide scalable, portable performance, quickly producing cutting-edge scientific results, establishing itself as a standard for wider adoption in the community, and allowing a broader range of codes to scale to hundreds of thousands of processors, as LaMEM currently does. By implementing kernels used by many research groups beneath a staggered-grid abstraction, library developers will be free to optimize them for modern hardware, reducing the barrier to large- and extreme-scale simulation across the field. We identify ambitious scientific goals which will proceed concurrently with development, along with lower-risk performance deliverables. We thus ensure an agile development environment which will directly advance domain science while providing a common backend, optimized for and extensible to emerging architectures.

The process by which the Earth and other planets form and evolve is of great scientific interest, but direct simulation of this process is a grand challenge. A central modeling task is to evolve the state of a viscous fluid representing slowly-deforming rock in the mantle and lithosphere.

The complexity of the system, and hence the number of modeling choices, combined with the high spatial and temporal resolution required to answer questions of fundamental interest (the mantle and lithosphere of an entire planet, evolving over billions of years), creates a daunting computational task. Much research is done in-house with rotating collaborators, and teams usually cannot develop reusable software components which fully exploit modern supercomputers.

Put another way, there is efficiency to be gained from a better interconnect between researcher and supercomputer. Supercomputing centers offer vast computational resources, but most geodynamics researchers do not use them on a continuous basis. Geodynamics researchers have a wealth of relevant scientific questions, yet often find it practically arduous to answer them when large-scale 3D modeling is required. This “bottleneck” can be mitigated with software development.

We propose the development of StagBL (“Staggered Grid Base Layer”), a common discretization and solver layer built around staggered-grid finite-difference discretizations of Stokes flow. It will interface with three “production” geodynamics codes: StagYY, I3ELVIS, and LaMEM. This common library can be optimized, tested, and improved as several research groups use it. It will allow many codes to scale to new problem sizes for the first time, immediately opening new avenues of research. Amongst other benefits, it will allow a user to transition easily between local machines, smaller clusters, and supercomputing systems like those at CSCS. It will apply the results of the GeoPC PASC Co-design Project to produce an enduring community component.

Open questions related to understanding the solidification of planets from magma oceans, the formation of continents and plate tectonics, and other cutting-edge topics can only be resolved with large-scale 3D simulation. The undeniable efficacy of the combination of a staggered finite-difference (finite-volume) grid with a particle-in-cell (PIC) advection scheme justifies the development of a common, high-performance framework to give researchers access to the largest simulation sizes.

StagBL centers on an abstraction of a parallel staggered finite-difference grid. This central object will provide automatic parallel decomposition, refinement, construction of multigrid transfer operators, access to standard and modern boundary conditions, and low-level operations related to particle advection. In addition, StagBL will provide Stokes and temperature operators and smoothers which interact with the staggered grid, along with several solver options.
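The StagBL API itself is a deliverable of this project and is not fixed here. As a point of reference only, the minimal sketch below uses PETSc's existing DMDA object to illustrate the kind of automatically decomposed, distributed structured grid that the staggered-grid abstraction will build upon and generalize; the grid size and the packing of four unknowns per cell are assumptions chosen purely for illustration.

    /* Illustrative only: PETSc DMDA as a stand-in for the managed,
       parallel structured grid underlying StagBL's staggered-grid object. */
    #include <petscdmda.h>

    int main(int argc, char **argv)
    {
      DM             dm;
      DMDALocalInfo  info;
      PetscErrorCode ierr;

      ierr = PetscInitialize(&argc, &argv, NULL, NULL); if (ierr) return ierr;

      /* A 32^3 grid with 4 unknowns per cell (velocity components and
         pressure, interpreted on staggered locations) and a stencil width
         of 1. The parallel decomposition is chosen automatically. */
      ierr = DMDACreate3d(PETSC_COMM_WORLD,
                          DM_BOUNDARY_NONE, DM_BOUNDARY_NONE, DM_BOUNDARY_NONE,
                          DMDA_STENCIL_BOX,
                          32, 32, 32,
                          PETSC_DECIDE, PETSC_DECIDE, PETSC_DECIDE,
                          4, 1, NULL, NULL, NULL, &dm); CHKERRQ(ierr);
      ierr = DMSetFromOptions(dm); CHKERRQ(ierr);
      ierr = DMSetUp(dm); CHKERRQ(ierr);

      /* Each rank sees only its own subdomain; operator assembly and
         particle-related operations loop over these local ranges. */
      ierr = DMDAGetLocalInfo(dm, &info); CHKERRQ(ierr);
      ierr = PetscPrintf(PETSC_COMM_SELF, "local block: %d x %d x %d cells\n",
                         (int)info.xm, (int)info.ym, (int)info.zm); CHKERRQ(ierr);

      ierr = DMDestroy(&dm); CHKERRQ(ierr);
      ierr = PetscFinalize();
      return ierr;
    }

A grid object of this kind already supplies the ghosted local vectors and index ranges needed for finite-difference stencils; the StagBL grid object will additionally distinguish unknowns living on cell faces, edges, and centers, as required by the staggered discretization.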

An essential component of the proposal is that the deliverables include not just a working piece of software, but a set of portable default solver settings that should “just work” for users. In particular, we envision three well-defined presets for use with the library: “small”, “medium”, and “large”. “Small” problems are those amenable to solution with direct solvers. Exploration of new physical processes with the application codes almost always begins with small problems, often with exploratory parameters; hence the robustness of direct solvers is ideal, despite their non-scalable time and memory requirements. “Large” problems are those for which well-scaling multi-level solvers are essential. Scalable solvers are, again almost by definition, sensitive to the particulars of the simulation scenario. Thus, we will provide not only reasonable default settings and examples to work from, but also simple, easy-to-use analysis tools to identify more specifically why a multigrid method fails to converge. We also explicitly consider the often-ignored “medium”-sized problem, here defined as one which does not clearly prioritize scalability or robustness; we intend to provide this option via preconditioned Krylov methods for monolithic saddle-point matrices. Furthermore, we consider examples and documentation to be first-class deliverables. We provide a concrete path from beginner to supercomputer user by including an implementation of a minimal application code, StagBLDemo.
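To make the presets concrete, the sketch below shows one plausible mapping of the three presets onto PETSc run-time options, installed through a small helper. The option strings, the numbering of the velocity split, and the helper name StagBLDemoSetSolverPreset are illustrative assumptions rather than a committed interface; the actual defaults will be chosen and tuned during the project.

    /* Illustrative preset options; actual defaults to be determined. */
    #include <petscsys.h>
    #include <string.h>

    /* "small": robust sparse direct solve (an external package such as
       MUMPS would be selected for parallel runs). */
    static const char *preset_small  = "-ksp_type preonly -pc_type lu";

    /* "medium": preconditioned Krylov method on the monolithic
       saddle-point matrix, with a Schur-complement field split. */
    static const char *preset_medium =
      "-ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type schur "
      "-pc_fieldsplit_schur_fact_type upper";

    /* "large": as "medium", but with multigrid on the velocity block
       (here split 0; the split naming depends on how the application
       registers its fields). */
    static const char *preset_large  =
      "-ksp_type fgmres -pc_type fieldsplit -pc_fieldsplit_type schur "
      "-fieldsplit_0_pc_type mg";

    /* Hypothetical helper: install the chosen preset into the global
       options database before the solver is configured. */
    PetscErrorCode StagBLDemoSetSolverPreset(const char *name)
    {
      const char    *opts = preset_medium;
      PetscErrorCode ierr;

      if      (!strcmp(name, "small")) opts = preset_small;
      else if (!strcmp(name, "large")) opts = preset_large;
      ierr = PetscOptionsInsertString(NULL, opts); CHKERRQ(ierr);
      return 0;
    }

Because such settings live in the options database, a user can override any part of a preset from the command line, which is also where standard diagnostics such as -ksp_monitor and -ksp_converged_reason attach when investigating convergence problems.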

StagBL will be designed to be extensible, allowing for the introduction of novel techniques as they are developed. In this proposal, we include the implementation of a recently published free-surface boundary condition. Additional extensibility is provided by the optional interface to the PETSc library, which is itself becoming a platform for rapid experimentation with novel solvers.
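As a sketch of what this run-time extensibility looks like in practice (assuming the hypothetical routine name StagBLDemoSolveStokes), the Stokes solve in the PETSc-backed path can be routed through a standard KSP configured from the options database, so that alternative Krylov methods and preconditioners, including experimental ones registered with PETSc, can be selected without recompiling the application:

    #include <petscksp.h>

    /* Hypothetical demo routine: solve A x = b with whatever solver the
       user selects at run time, e.g. "-ksp_type gcr -pc_type gamg". */
    PetscErrorCode StagBLDemoSolveStokes(Mat A, Vec b, Vec x)
    {
      KSP            ksp;
      PetscErrorCode ierr;

      ierr = KSPCreate(PetscObjectComm((PetscObject)A), &ksp); CHKERRQ(ierr);
      ierr = KSPSetOperators(ksp, A, A); CHKERRQ(ierr);
      ierr = KSPSetFromOptions(ksp); CHKERRQ(ierr); /* reads -ksp_type, -pc_type, ... */
      ierr = KSPSolve(ksp, b, x); CHKERRQ(ierr);
      ierr = KSPDestroy(&ksp); CHKERRQ(ierr);
      return 0;
    }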