
Commit

updated documentation
johannesulf committed May 19, 2024
1 parent ea4538b commit 9cd8559
Showing 19 changed files with 115 additions and 300 deletions.
2 changes: 1 addition & 1 deletion docs/api/helpers.txt → docs/api/helpers.rst
@@ -1,5 +1,5 @@
dsigma.helpers module
---------------------
=====================

.. automodule:: dsigma.helpers
:members:
2 changes: 1 addition & 1 deletion docs/api/jackknife.txt → docs/api/jackknife.rst
@@ -1,5 +1,5 @@
dsigma.jackknife module
-----------------------
=======================

.. automodule:: dsigma.jackknife
:members:
2 changes: 1 addition & 1 deletion docs/api/physics.txt → docs/api/physics.rst
@@ -1,5 +1,5 @@
dsigma.physics module
---------------------
=====================

.. automodule:: dsigma.physics
:members:
2 changes: 1 addition & 1 deletion docs/api/precompute.txt → docs/api/precompute.rst
@@ -1,5 +1,5 @@
dsigma.precompute module
------------------------
========================

.. automodule:: dsigma.precompute
:members:
2 changes: 1 addition & 1 deletion docs/api/stacking.txt → docs/api/stacking.rst
@@ -1,5 +1,5 @@
dsigma.stacking module
----------------------
======================

.. automodule:: dsigma.stacking
:members:
2 changes: 1 addition & 1 deletion docs/api/surveys.txt → docs/api/surveys.rst
@@ -1,5 +1,5 @@
dsigma.surveys module
---------------------
=====================

.. automodule:: dsigma.surveys.cfhtls
:members:
22 changes: 6 additions & 16 deletions docs/application_des.txt → docs/application_des.rst
@@ -1,15 +1,13 @@
########################
Dark Energy Survey (DES)
########################
========================

.. note::
This guide has not been inspected or endorsed by the DES collaboration.

Here, we will show how to cross-correlate BOSS lens galaxies with shape catalogs from DES. We will work with the Y3 data release. Check out the documentation of :code:`dsigma` v0.6 if you want to see how to reduce DES Y1 data.

********************
Downloading the Data
********************
--------------------

DES Y3 data can be downloaded `here <https://desdr-server.ncsa.illinois.edu/despublic/y3a2_files/y3kp_cats/>`_. The following command should download all the necessary data.

@@ -70,10 +68,8 @@ Unfortunately, the total amount of data is very large, i.e. hundreds of GBytes.
The final file, :code:`des_y3.hdf5`, is also available from the :code:`dsigma` authors upon request.


******************
Preparing the Data
******************
------------------

First, we must put the data into a format easily understandable by :code:`dsigma`. There are several helper functions to make this easy. Additionally, we want to use the :math:`n(z)`'s provided by DES Y3 to correct for photometric redshift biases.

@@ -101,10 +97,8 @@ After running this selection response calculation, we are ready to drop all gala
table_n = Table.read('des_y3.hdf5', path='redshift')

***********************
Precomputing the Signal
***********************
-----------------------

We will now run the computationally expensive precomputation phase. Here, we first define the lens-source separation cuts. We require that :math:`z_l + 0.1 < z_{t, \rm low}` where :math:`z_{t, \rm low}` is the lower redshift bin edge of the tomographic bin `(Myles et al., 2021) <https://ui.adsabs.harvard.edu/abs/2021MNRAS.505.4249M>`_ each source galaxy belongs to. Afterward, we run the actual precomputation.

@@ -122,10 +116,8 @@ We will now run the computationally expensive precomputation phase. Here, we fir
precompute(table_r, table_s, rp_bins, cosmology=Planck15, comoving=True,
table_c=table_c, lens_source_cut=0.1, progress_bar=True)
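The lens-source separation cut described above can be illustrated with plain NumPy. This is only a sketch with made-up bin edges and redshifts, not the dsigma implementation; in practice :code:`precompute` applies the cut internally via the :code:`lens_source_cut` argument.

```python
import numpy as np

# Hypothetical lower edges of four tomographic source bins
# (illustrative values, not the actual DES Y3 edges).
z_t_low = np.array([0.2, 0.43, 0.63, 0.9])

# Tomographic bin index of each source galaxy and an example lens redshift.
source_bin = np.array([0, 1, 2, 3])
z_l = 0.4

# Keep only sources whose bin's lower edge lies above z_l + 0.1.
use = z_l + 0.1 < z_t_low[source_bin]
print(use.tolist())  # -> [False, False, True, True]
```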

*******************
Stacking the Signal
*******************
-------------------

The total galaxy-galaxy lensing signal can be obtained with the following code. It first filters out all BOSS galaxies for which we couldn't find any source galaxy nearby. Then we divide the lens sample into jackknife samples that we will later use to estimate uncertainties. Finally, we stack the lensing signal in four different BOSS redshift bins and save the data.

@@ -163,9 +155,7 @@ We choose to include all the necessary correction factors. In addition to the m
result.write('des_{}.csv'.format(lens_bin), overwrite=True)
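The jackknife uncertainties mentioned above come from repeating the measurement with one spatial region removed at a time. A minimal NumPy sketch of the delete-one jackknife error (toy numbers, not the dsigma internals):

```python
import numpy as np

# Toy Delta Sigma values in one radial bin, each measured with a
# different jackknife region excluded.
ds_jk = np.array([1.02, 0.98, 1.01, 0.99, 1.00])
n_jk = len(ds_jk)

# Delete-one jackknife variance: (n - 1) / n times the summed
# squared deviations from the mean of the leave-one-out estimates.
ds_err = np.sqrt((n_jk - 1) / n_jk * np.sum((ds_jk - ds_jk.mean())**2))
```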

****************
Acknowledgements
****************
----------------

When using the above data and algorithms, please read and follow the acknowledgement section on the `DES Y3 data release site <https://des.ncsa.illinois.edu/releases/y3a2>`_.
23 changes: 6 additions & 17 deletions docs/application_hsc.txt → docs/application_hsc.rst
@@ -1,16 +1,13 @@
#######################
Hyper Suprime-Cam (HSC)
#######################
=======================

.. note::
This guide has not been inspected or endorsed by the HSC collaboration.

This tutorial will show how to cross-correlate BOSS lens galaxies with lensing catalogs from the HSC survey.


********************
Downloading the Data
********************
--------------------

First, we need to download the necessary HSC data files. Head to the `HSC data release site <https://hsc-release.mtk.nao.ac.jp/doc/>`_ and register for an account if you haven't done so already. As of September 2020, the only publicly available data set is part of the public data release 2 (PDR2) and goes back to the internal S16A release.

@@ -40,10 +37,8 @@ As you can see, we will use the Ephor Afterburner photometric redshifts in this

In addition to the source catalog, we need a calibration catalog to correct for potential biases stemming from using shallow photometric redshift point estimates. The relevant files can be downloaded using the following links: `1 <https://hsc-release.mtk.nao.ac.jp/archive/filetree/cosmos_photoz_catalog_reweighted_to_s16a_shape_catalog/Afterburner_reweighted_COSMOS_photoz_FDFC.fits>`_, `2 <https://hsc-release.mtk.nao.ac.jp/archive/filetree/cosmos_photoz_catalog_reweighted_to_s16a_shape_catalog/ephor_ab/pdf-s17a_wide-9812.cat.fits>`_, `3 <https://hsc-release.mtk.nao.ac.jp/archive/filetree/cosmos_photoz_catalog_reweighted_to_s16a_shape_catalog/ephor_ab/pdf-s17a_wide-9813.cat.fits>`_.


******************
Preparing the Data
******************
------------------

First, we must put the data into a format easily understandable by :code:`dsigma`. There are several helper functions to make this easy.

@@ -67,10 +62,8 @@ First, we must put the data into a format easily understandable by :code:`dsigma
table_c = dsigma_table(table_c, 'calibration', w_sys='SOM_weight',
w='weight_source', z_true='COSMOS_photoz', survey='HSC')

***********************
Precomputing the Signal
***********************
-----------------------

We will now run the computationally expensive precomputation phase. Here, we first define the lens-source separation cuts. We require that :math:`z_l < z_{s, \rm min}` and :math:`z_l + 0.1 < z_s`. Afterward, we run the actual precomputation.

@@ -85,10 +78,8 @@ We will now run the computationally expensive precomputation phase. Here, we fir
precompute(table_r, table_s, rp_bins, cosmology=Planck15, comoving=True,
table_c=table_c, lens_source_cut=0.1, progress_bar=True)
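The two separation conditions above can be sketched as a simple boolean mask (illustrative redshifts only; :code:`precompute` applies the second condition via :code:`lens_source_cut`):

```python
import numpy as np

# Made-up photometric point estimates and lower bounds for three sources.
z_s = np.array([0.6, 0.9, 1.2])
z_s_min = np.array([0.3, 0.7, 0.9])
z_l = 0.55  # example lens redshift

# Keep lens-source pairs with z_l < z_s_min and z_l + 0.1 < z_s.
use = (z_l < z_s_min) & (z_l + 0.1 < z_s)
print(use.tolist())  # -> [False, True, True]
```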

*******************
Stacking the Signal
*******************
-------------------

The total galaxy-galaxy lensing signal can be obtained with the following code. It first filters out all BOSS galaxies for which we couldn't find any source galaxy nearby. Then we divide the lens sample into jackknife samples that we will later use to estimate uncertainties. Finally, we stack the lensing signal in four different BOSS redshift bins and save the data.

@@ -130,9 +121,7 @@ We choose to include all the necessary correction factors. The shear responsivi
result.write('hsc_{}.csv'.format(lens_bin))

****************
Acknowledgements
****************
----------------

When using the above data and algorithms, please make sure to cite `Mandelbaum et al. (2018a) <https://ui.adsabs.harvard.edu/abs/2018PASJ...70S..25M/abstract>`_ and `Mandelbaum et al. (2018b) <https://ui.adsabs.harvard.edu/abs/2018MNRAS.481.3170M/abstract>`_.
7 changes: 2 additions & 5 deletions docs/application_intro.txt → docs/application_intro.rst
@@ -1,13 +1,10 @@
############
Introduction
############
============

In the following, we will discuss various examples of calculating galaxy-galaxy lensing signals. In particular, we will calculate galaxy-galaxy lensing signals for three large imaging surveys: the Dark Energy Survey (DES), the Hyper Suprime-Cam (HSC) survey, and the Kilo-Degree Survey (KiDS). We will use galaxies from the Baryon Oscillation Spectroscopic Survey (BOSS) as lenses. Specifically, we will qualitatively reproduce the results of the Lensing Without Borders project. To this end, we will cross-correlate the same sets of BOSS lens galaxies with different imaging surveys. If everything works correctly, the lensing amplitude :math:`\Delta\Sigma` for the same lens samples should be consistent between the various imaging surveys.


*****************
BOSS Lens Catalog
*****************
-----------------

The BOSS target catalogs are publicly available from the `SDSS data server <https://data.sdss.org/sas/dr12/boss/lss/>`_. In the following, we will assume that all relevant lens
(:code:`galaxy_DR12v5_CMASSLOWZTOT_*.fits.gz`) and random files (:code:`random0_DR12v5_CMASSLOWZTOT_*.fits.gz`) are in the working directory. The following code reads the data and puts it in a format easily understandable by :code:`dsigma`.
23 changes: 6 additions & 17 deletions docs/application_kids.txt → docs/application_kids.rst
@@ -1,16 +1,13 @@
#########################
Kilo-Degree Survey (KiDS)
#########################
=========================

.. note::
This guide has not been inspected or endorsed by the KiDS collaboration.

This tutorial will show how to cross-correlate BOSS lens galaxies with source catalogs from KiDS. We will work with the 4th data release (KiDS-1000).


********************
Downloading the Data
********************
--------------------

First, we need to download the necessary KiDS data files. The following commands should download all the required data.

@@ -20,10 +17,8 @@ First, we need to download the necessary KiDS data files. The following commands
wget http://kids.strw.leidenuniv.nl/DR4/data_files/KiDS1000_SOM_N_of_Z.tar.gz
tar -xvf KiDS1000_SOM_N_of_Z.tar.gz --strip 1

******************
Preparing the Data
******************
------------------

First, we must put the data into a format easily understandable by :code:`dsigma`. There are several helper functions to make this easy. Additionally, we want to use the :math:`n(z)`'s provided by KiDS to correct for photometric redshift biases. Thus, we also bin the source galaxies by photometric redshift and read the source redshift distribution in each photometric redshift bin.

@@ -52,10 +47,8 @@ First, we must put the data into a format easily understandable by :code:`dsigma
table_n['n'] = np.vstack(
[np.genfromtxt(fname.format(i + 1))[:, 1] for i in range(5)]).T
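The binning of sources by photometric redshift mentioned above can be sketched with :code:`numpy.digitize`. The bin edges below are the nominal KiDS-1000 tomographic edges, but treat them as illustrative rather than authoritative:

```python
import numpy as np

# Nominal edges of the five KiDS-1000 tomographic bins (illustrative).
z_bins = np.array([0.1, 0.3, 0.5, 0.7, 0.9, 1.2])

# Photometric redshifts of a few example sources.
z_s = np.array([0.15, 0.45, 0.95])

# Bin index 0-4 for each source; sources outside the edges would map
# outside this range and should be discarded.
z_bin = np.digitize(z_s, z_bins) - 1
print(z_bin.tolist())  # -> [0, 1, 4]
```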

***********************
Precomputing the Signal
***********************
-----------------------

We will now run the computationally expensive precomputation phase. Here, we first define the lens-source separation cuts. We require that :math:`z_l + 0.1 < z_{t, \rm min}` where :math:`z_{t, \rm min}` is the minimum redshift of the tomographic bin each source galaxy belongs to. Afterward, we run the actual precomputation.

@@ -72,10 +65,8 @@ We will now run the computationally expensive precomputation phase. Here, we fir
precompute(table_r, table_s, rp_bins, cosmology=Planck15, comoving=True,
table_n=table_n, lens_source_cut=0.1, progress_bar=True)

*******************
Stacking the Signal
*******************
-------------------

The total galaxy-galaxy lensing signal can be obtained with the following code. It first filters out all BOSS galaxies for which we couldn't find any source galaxy nearby. Then we divide the lens sample into jackknife samples that we will later use to estimate uncertainties. Finally, we stack the lensing signal in four different BOSS redshift bins and save the data.

@@ -113,9 +104,7 @@ We choose to include all the necessary correction factors. The multiplicative s
result.write('kids_{}.csv'.format(lens_bin), overwrite=True)

****************
Acknowledgements
****************
----------------

When using the above data and algorithms, please read and follow the acknowledgment section on the `KiDS DR4 release site <http://kids.strw.leidenuniv.nl/DR4/KiDS-1000_shearcatalogue.php#ack>`_.

