Celeritas is a new Monte Carlo transport code designed to accelerate scientific discovery in high energy physics by improving detector simulation throughput and energy efficiency using GPUs.

Celeritas

The Celeritas project implements HEP detector physics on GPU accelerator hardware with the ultimate goal of supporting the massive computational requirements of the HL-LHC upgrade.

Documentation

Most of the Celeritas documentation lives in the codebase itself, as a combination of static RST documentation and Doxygen-markup comments in the source code. The full Celeritas user documentation (including selected code documentation incorporated by Breathe) and the Celeritas code documentation are mirrored on our GitHub Pages site. You can generate these yourself (if the necessary prerequisites are installed) by setting the CELERITAS_BUILD_DOCS=ON configuration option and running ninja doc (user) or ninja doxygen (developer).
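The documentation build described above can be sketched as a short session (this assumes an existing Ninja-based CMake build directory and that Doxygen, Sphinx, and Breathe are already installed):

```shell
# From within the configured build directory:
# enable the documentation targets...
cmake -DCELERITAS_BUILD_DOCS=ON .
# ...then build the user documentation
ninja doc
# ...or the developer (Doxygen) documentation
ninja doxygen
```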

Installation for applications

The easiest way to install Celeritas as a library/app is with Spack:

  • Install Spack and set it up for CUDA usage (see the Spack documentation).
  • Install Celeritas with spack install celeritas
  • Use spack load celeritas to add the installation to your PATH.
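For a default (CPU-only) build, the steps above reduce to a short session, sketched here assuming Spack is already installed and on your PATH:

```shell
# Install Celeritas and all of its dependencies via Spack
spack install celeritas
# Make the installed binaries and libraries visible in the current shell
spack load celeritas
```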

To install a GPU-enabled Celeritas build with celeritas+vecgeom (the default geometry), make sure that VecGeom is also built with CUDA support. To do so, set the following configuration:

# Replace cuda_arch=80 with your target architecture
$ spack config add packages:vecgeom:variants:"cxxstd=17 +cuda cuda_arch=80"
$ spack install celeritas +cuda cuda_arch=80

Then see the "Downstream usage as a library" section of the installation documentation for how to use Celeritas in your application or framework.
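As a rough illustration of downstream usage, a consuming project's CMake configuration might look like the following sketch. It assumes Celeritas exports a CMake package configuration providing a Celeritas::celeritas target; consult the "Downstream usage as a library" documentation for the actual target and component names.

```cmake
# CMakeLists.txt for a hypothetical downstream application
cmake_minimum_required(VERSION 3.18)
project(my_app LANGUAGES CXX)

# Locate an installed Celeritas (e.g. via CMAKE_PREFIX_PATH or `spack load celeritas`)
find_package(Celeritas REQUIRED)

add_executable(my_app main.cc)
# The imported target name here is an assumption; see the installation docs
target_link_libraries(my_app PRIVATE Celeritas::celeritas)
```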

Installation for developers

Since Celeritas is still under heavy development and is not yet full-featured for downstream integration, you are likely installing it for development purposes. The installation documentation has a complete description of the code's dependencies and installation process for development.

As an example, if you have the Spack package manager installed and want to do development on a CUDA system with Volta-class graphics cards, execute the following steps from within the cloned Celeritas source directory:

# Set up CUDA (optional)
$ spack external find cuda
# Install celeritas dependencies
$ spack env create celeritas scripts/spack.yaml
$ spack env activate celeritas
$ spack config add packages:all:variants:"cxxstd=17 +cuda cuda_arch=70"
$ spack install
# Configure, build, and test
$ ./build.sh base

If you don't use Spack but have all the dependencies you want (Geant4, googletest, VecGeom, etc.) in your CMAKE_PREFIX_PATH, you can configure and build Celeritas as you would any other project:

$ mkdir build && cd build
$ cmake ..
$ make && ctest

Celeritas guarantees full compatibility and correctness only on the combinations of compilers and dependencies tested under continuous integration:

  • Compilers:
    • GCC 8.4, 12.3
    • Clang 10.0, 15.0
    • GCC 11.3 + NVCC 11.8
    • HIP-Clang 15.0
  • Dependencies:
    • Geant4 11.0.3
    • VecGeom 1.2.5

Partial compatibility and correctness are available for an extended range of Geant4 versions:

  • 10.5-10.7: no support for tracking manager offload
  • 11.0: no support for fast simulation offload
  • 11.1-11.2: no support for default Rayleigh scattering cross section (see celeritas-project#1091)

Since we compile with extra warning flags and avoid non-portable code, most other compilers should work. The full set of configurations is viewable on CI platforms (Jenkins and GitHub Actions). Compatibility fixes that do not cause newer versions to fail are welcome.

Development

See the contribution guide for the contribution process, the development guidelines for further details on coding in Celeritas, and the administration guidelines for community standards and roles.

Directory structure

| Directory | Description |
| --- | --- |
| app | Source code for installed executable applications |
| cmake | Implementation code for CMake build configuration |
| doc | Code documentation and manual |
| example | Example applications and input files |
| external | Automatically fetched external CMake dependencies |
| interface | Wrapper interfaces to Celeritas library functions |
| scripts | Development and continuous integration helper scripts |
| src | Library source code |
| test | Unit tests |

Citing Celeritas

If using Celeritas in your work, we ask that you cite the code using its DOECode registration:

Seth R. Johnson, Amanda Lund, Soon Yung Jun, Stefano Tognini, Guilherme Lima, Paul Romano, Philippe Canal, Ben Morgan, and Tom Evans. “Celeritas,” July 2022. https://doi.org/10.11578/dc.20221011.1.
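For LaTeX users, the citation above can be expressed as a BibTeX entry. This is a sketch assembled from the fields given here; the entry key is arbitrary.

```bibtex
@misc{celeritas,
  author = {Johnson, Seth R. and Lund, Amanda and Jun, Soon Yung and
            Tognini, Stefano and Lima, Guilherme and Romano, Paul and
            Canal, Philippe and Morgan, Ben and Evans, Tom},
  title  = {Celeritas},
  year   = {2022},
  month  = jul,
  doi    = {10.11578/dc.20221011.1},
  url    = {https://doi.org/10.11578/dc.20221011.1}
}
```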

A continually evolving list of works authored by (or with content authored by) core team members is available in our citation file.
