This repository contains hugs, a low-surface-brightness galaxy detection pipeline for the Hyper Suprime-Cam Subaru Strategic Program (HSC-SSP). As described in this paper, hugs was used to discover ~800 low-surface-brightness galaxies within the first ~200 square degrees of the HSC-SSP.
- numpy
- scipy
- pandas
- astropy
- LSST Science Pipelines (for image processing)
- SExtractor version 2.19.5 (for source detection)
- sqlalchemy (for pipeline database)
- schwimmbad (for parallel processing)
- sfdmap (for extinction corrections)
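Most of these Python packages can be installed with pip; the LSST Science Pipelines and SExtractor are not pip-installable and need to be set up separately (see their own documentation). For example:

pip install numpy scipy pandas astropy sqlalchemy schwimmbad sfdmap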
If you have access to the calibrated exposures from a complete HSC rerun (e.g., at Mitaka, IAA, or Princeton), which can be accessed through the data butler, you can use the `runner.py` script to run the hugs pipeline on a single patch:
python scripts/runner.py -t $tract -p $patch
or on a list of patches stored in a csv file:
python scripts/runner.py --patches_fn $filename
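The expected format of the patches csv file is defined by `runner.py`; a minimal sketch of building one with pandas might look like the following (the tract and patch column names are assumptions for illustration, not taken from the hugs source):

```python
import pandas as pd

# Hypothetical patch list -- the column names ("tract", "patch") are assumed
# here for illustration; check runner.py for the format it actually expects.
patches = pd.DataFrame({
    "tract": [9348, 9348, 9349],
    "patch": ["4,4", "4,5", "0,3"],
})
patches.to_csv("patches.csv", index=False)
```

which could then be run with

python scripts/runner.py --patches_fn patches.csv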
The pipeline parameters are given in a yaml configuration file. The default config file is here. The rerun directory is given by the `data_dir` parameter. You can pass a custom config file to `runner.py` with the `--config_fn` command line argument.
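Because the config is plain yaml, one way to customize a run is to load the default config, override a few parameters, and write out a new file to pass via `--config_fn`. A minimal sketch, assuming a hypothetical path for the default config (only `data_dir` and `hugs_io` are parameter names documented in this README):

```python
import yaml

# The default config path is a placeholder -- point this at the actual
# default yaml file shipped with the repository.
with open("default_config.yml") as f:
    config = yaml.safe_load(f)

# data_dir is the HSC rerun directory; hugs_io (described below) is the
# directory where the pipeline writes its outputs.
config["data_dir"] = "/path/to/hsc/rerun"
config["hugs_io"] = "/path/to/hugs/output"

# Save a custom config to pass to runner.py with --config_fn.
with open("my_config.yml", "w") as f:
    yaml.safe_dump(config, f)
```

python scripts/runner.py -t $tract -p $patch --config_fn my_config.yml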
The pipeline builds a catalog as a sqlite database using sqlalchemy. The database will be created within the `hugs_io` directory, which is specified in the config file. The database model file is here.
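Once a run has finished, the sqlite catalog can be inspected with sqlalchemy without knowing the schema in advance. A minimal sketch, assuming a hypothetical database filename inside `hugs_io`:

```python
import pandas as pd
from sqlalchemy import create_engine, inspect

# The database filename is a placeholder -- look inside the hugs_io
# directory from your config for the actual sqlite file.
engine = create_engine("sqlite:////path/to/hugs_io/hugs_catalog.db")

# List the tables defined by the sqlalchemy database model.
print(inspect(engine).get_table_names())

# Load one table into a pandas DataFrame (replace "source" with a
# table name from the list printed above).
df = pd.read_sql_table("source", engine)
print(df.head())
```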
The pipeline can be run in parallel mode using the `--mpi` (for MPI) or `--ncores` (for multiprocessing) arguments.
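For example, something along these lines (the exact invocations are illustrative; `--ncores` is assumed to take the number of worker processes, and the MPI mode is assumed to be launched with mpiexec):

python scripts/runner.py --patches_fn patches.csv --ncores 4

mpiexec -n 8 python scripts/runner.py --patches_fn patches.csv --mpi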
If you are working with HSC-SSP images, the LSST stack provides an extremely useful suite of image processing tools. hugs uses the LSST codebase to "clean" images (i.e., remove bright sources and their associated diffuse light) and SExtractor to extract sources. For an example of how to use hugs on public HSC images, check out this notebook.
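For reference, the kind of butler access hugs builds on looks roughly like the sketch below, which fetches a coadd image for a single tract and patch. This assumes the Gen2 butler interface; the `deepCoadd_calexp` dataset type and `HSC-I` filter name follow standard HSC-SSP conventions and are not taken from the hugs source.

```python
# Requires the LSST Science Pipelines (Gen2 butler interface).
from lsst.daf.persistence import Butler

# Point the butler at the rerun directory (the data_dir from the config).
butler = Butler("/path/to/hsc/rerun")

# Fetch a coadd exposure for one tract/patch; the dataset type and filter
# name are standard HSC conventions, assumed here for illustration.
exposure = butler.get("deepCoadd_calexp", tract=9348, patch="4,4", filter="HSC-I")

# Pull out the pixel data as a numpy array.
image = exposure.getMaskedImage().getImage().getArray()
print(image.shape)
```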