diff --git a/README.md b/README.md index c1f37d88..a4c8bfe4 100644 --- a/README.md +++ b/README.md @@ -32,7 +32,7 @@ To cite a certain version, have a look at the [Zenodo site][10]. ## Install mHM can be compiled with cmake. See more details under [cmake manual][9]. -See also the [documentation][5] for detailed instructions to setup mHM. +See also the [documentation][5] for detailed instructions to set up mHM. ## Quick start diff --git a/doc/1-main.dox b/doc/1-main.dox index cdc74b31..ff262336 100644 --- a/doc/1-main.dox +++ b/doc/1-main.dox @@ -254,7 +254,7 @@ observed and calculated streamflows using three different time scales streamflow to downplay the effects of the peak flows over the low flows \cite HB2005 . These objective functions are denoted by \f$\phi_k,\, k=1,4\f$. Every objective function should be normalized -in the interval [0,1], with 1 representing the best posible solution. +in the interval [0,1], with 1 representing the best possible solution. The overall objective function to be minimized is then \f[ diff --git a/doc/2-get_started.dox b/doc/2-get_started.dox index 9f284b09..eb6aa530 100644 --- a/doc/2-get_started.dox +++ b/doc/2-get_started.dox @@ -12,11 +12,11 @@ The mHM distribution usually comes with two example test domains located in test_domain/ test_domain_2/ \endcode -The directory provides input data for the test basin and output examples. Detailled information about this test basin can be found in the chapter \ref testbasin. All parameters and paths are already set by default to the test case, so you can just start the simulation with the command +The directory provides input data for the test basin and output examples. Detailed information about this test basin can be found in the chapter \ref testbasin. 
All parameters and paths are already set by default to the test case, so you can just start the simulation with the command \code ./mhm \endcode -This will run mHM on two basins simultanously and create output files for discharge and interception in test/output_b*. +This will run mHM on two basins simultaneously and create output files for discharge and interception in test/output_b*. The chapter \ref output provides further information on visualising mHM results. \section owncatch Run your own Simulation @@ -45,12 +45,12 @@ each single setting, this section will only roughly describe the structure of th LAI values defined for 10 land classes or choose to run mHM using gridded LAI input data (e.g., MODIS). Gridded LAI data must be provided on a daily time-step at the Level-0 resolution. \li Process Switches: -* - Proccess case 5 - potential evapotranspiration (PET): +* - Process case 5 - potential evapotranspiration (PET): * -# PET is input (processCase(5)=0) * -# PET after Hargreaves-Samani (processCase(5)=1) * -# PET after Priestley-Taylor (processCase(5)=2) * -# PET after Penman-Monteith (processCase(5)=3) -* - Proccess case 8 - routing can be activated (=1) or deactivated (=0). +* - Process case 8 - routing can be activated (=1) or deactivated (=0). \li Annual Cycles: Values for pan evaporation hold in impervious regions only. The meteorological forcing table disaggregates daily input data to hourly values. \subsection mhmoutputnml Output Configuration: Time Steps, States, Fluxes @@ -62,7 +62,7 @@ because the file size will greatly increase with the number of containing variab \subsection mhmparameters Regionalised Parameters: Initial Values and Ranges The file mhm_parameters.nml contains all global parameters and their initial values. They have been determined by calibration in German basins -and seem to be transferabel to other catchments. 
If you come up with a very different catchment or routing resolution, these parameters should be recalibrated +and seem to be transferable to other catchments. If you come up with a very different catchment or routing resolution, these parameters should be recalibrated (see section \ref calibration). @@ -77,7 +77,7 @@ until the objective function converges to a confident best fit. mHM comes with four available optimization methods: \li MCMC: The Monte Carlo Markov Chain sampling of parameter sets is recommended for estimation of parameter uncertainties. Intermediate results are written to mcmc_tmp_parasets.nc . -\li DDS: The Dynamically Dimensioned Search is an optimization routine known improve the objective within a small number of iterations. However, the result of DDS is not neccessarily close to your global optimum. Intermediate results are written to dds_results.out . +\li DDS: The Dynamically Dimensioned Search is an optimization routine known improve the objective within a small number of iterations. However, the result of DDS is not necessarily close to your global optimum. Intermediate results are written to dds_results.out . \li Simulated Annealing: Simulated Annealing is a global optimization algorithm. SA is known to require a large number of iterations before convergence (e.g. 100 times more than DDS), but finds parameter sets closer to the global minimum like DDS. Intermediate results are written to anneal_results.out . \li SCE: The Shuffled Complex Evolution is a global optimization algorithm which is based on the shuffling of parameter complexes. It needs more iterations compared to the DDS (e.g. 20 times more), but less compared to Simulated Annealing. The increasing computational effort (i.e. iterations) leads to more reliable estimation of the global optimum compared to DDS. Intermediate results are written to sce_results.out . 
@@ -110,7 +110,7 @@ In mhm_parameters.nml you find the initial values and ranges from w Most ranges are very sensitive and have been determined by detailed sensitivity analysis, so we do not recommend to change them. With the FLAG column you may choose which parameters are allowed to be optimised. The parameter gain_loss_GWreservoir_karstic is not meant to be optimised. -Please mind that optimization runs will take long and may demand a huge amount of hardware ressources. We recommend to submit those jobs to a cluster computing system. +Please note that optimization runs can take a long time and may demand a large amount of hardware resources. We recommend submitting these jobs to a cluster computing system. \subsection optiafter Final Calibration Results @@ -132,6 +132,6 @@ There are two scripts that help you with that: As soon as the new parameters are set, deactivate the optimize switch in mhm.nml and rerun mHM once in order to obtain the final optimised output. -You should also have a look at the parameter evolution (e.g. sce_results.out) or final results. If any of the parameters stick to the very end of their allowed range, the result is not optimal and you will be in serious trouble. Possible reasons might be bad parameter ranges (even though they have been optimised mathematically, but in a basin or resoluion not comparable to yours) or bad input data. +You should also have a look at the parameter evolution (e.g. sce_results.out) or the final results. If any of the parameters sticks to the very end of its allowed range, the result is not optimal and should not be trusted. Possible reasons might be bad parameter ranges (they may have been optimised mathematically, but in a basin or resolution not comparable to yours) or bad input data.
*/ diff --git a/doc/3-data_preparation.dox b/doc/3-data_preparation.dox index 9236e65c..de37e88d 100644 --- a/doc/3-data_preparation.dox +++ b/doc/3-data_preparation.dox @@ -5,7 +5,7 @@ \section starting Getting Started -To run mHM the user requires a number of datasets. The follwing subsection gives +To run mHM the user requires a number of datasets. The following subsection gives a short overview and references to freely available datasets. \subsection meteo_data Meteorological variables @@ -40,13 +40,13 @@ meteorological variables may be need: Name | Unit | Temporal resolution | | -------------------------- | -------------------------- | -------------------------- | -0 - Potential evapotranspiration | \f$\mathrm{mm}\f$ | hourly to daily -1 - Minimum air temperature | \f$^\circ \mathrm{C}\f$ | daily -1 - Maximum air temperature | \f$^\circ \mathrm{C}\f$ | daily -2 - Net radiation | \f$ W\;m^{-2} \f$ | daily -3 - Net radiation | \f$ W\;m^{-2} \f$ | daily -3 - Absolut vapur pressure of air | \f$ Pa \f$ | daily -3 - Windspeed | \f$ m\;s^{-1} \f$ | daily +0 - Potential evapotranspiration | \f$\mathrm{mm}\f$ | hourly to daily +1 - Minimum air temperature | \f$^\circ \mathrm{C}\f$ | daily +1 - Maximum air temperature | \f$^\circ \mathrm{C}\f$ | daily +2 - Net radiation | \f$ W\;m^{-2} \f$ | daily +3 - Net radiation | \f$ W\;m^{-2} \f$ | daily +3 - Absolute vapour pressure of air | \f$ Pa \f$ | daily +3 - Windspeed | \f$ m\;s^{-1} \f$ | daily \subsection morph_data Morphological variables @@ -71,7 +71,7 @@ Streamflow location | \f$(\mathrm{m}, \mathrm{m})\f$ (lat,lon) \subsection luse_data Land Cover -Free land cover data is available from different sources. The Corine programm provides land cover scenes for Europe +Free land cover data is available from different sources.
The Corine programme provides land cover scenes for Europe with resolutions of 100m and 250m (http://www.eea.europa.eu/publications/COR0-landcover), the Global Land Cover Map 2000 a dateset covers the entire world in a resolution of 1km. @@ -84,7 +84,7 @@ Leaf area index | - | weekly to monthly \subsection gauge_data Gauging Station Information -Streamflow measurments should also be available from federal authorities. In addition, the Global Runoff Data Center provides +Streamflow measurements should also be available from federal authorities. In addition, the Global Runoff Data Center provides timeseries of varying length for several thousand gauging stations all over the world (http://www.bafg.de/GRDC) @@ -129,7 +129,7 @@ Optionally, certain pixels can be extracted, too: ncks -d x,1,1 -d y,2,2 MAP.nc PIXEL.nc \endcode -\subsection extractncdata Extact Data to the Time Period of Interest +\subsection extractncdata Extract Data to the Time Period of Interest Let MAP.nc be the forcing data including your period of interest. Let DATE1 and DATE2 be the starting and ending dates of your period, respectively. Their date format is YYYY-MM-DD. The data PERIOD.nc can be extracted by the CDO command: @@ -169,7 +169,7 @@ For example, you can change the variable name TEMPERATURE in your NETCDF file wi Every meteorological data file should come with an additional file "header.txt" in the same directory. In the following example, we has only one pixel of data (ncols=nrows=1) and a cell size of 4km. -The easting and northing coordinates of the lower left cornaer should be similar to the morphological data of +The easting and northing coordinates of the lower left corner should be similar to the morphological data of your catchment. \code ncols 1 @@ -202,14 +202,14 @@ python [MHM_DIRECTORY]/pre-proc/create_latlon.py -h create_latlon.py needs several specifications via command line switches. 
First, the coordinate system of the morphological and meteorological data have to be specified according to www.spatialreference.org by using the switch: -c. -Second, you need to specify three header files containing the corresponding information for the different spatail -resolutions connected to mHM. The three resolutions are the resoltution of 1) the morphological input (switch: -f), +Second, you need to specify three header files containing the corresponding information for the different spatial +resolutions connected to mHM. The three resolutions are the resolution of 1) the morphological input (switch: -f), 2) the hydrological simulation (switch: -g), and 3) the routing (switch: -e). These header files can be produced by adopting the header file of the meteorological data Therefore, copy one of these files that you generated for meteo data (see \ref meteo): \code -cp [INPUT_DIRECTOY]/input/meteo/pre/header.txt [INPUT_DIRECTORY]/input/latlon/ +cp [INPUT_DIRECTORY]/input/meteo/pre/header.txt [INPUT_DIRECTORY]/input/latlon/ \endcode Edit the new file header.txt such that cellsize equals your hydrologic resolution. You have to adapt ncols and nrows @@ -231,7 +231,7 @@ we recommend to create different directories for each resolution you need, conta MHM needs several morphological input datasets. All these have to be provided as raster maps in the ArcGis ascii-format, which stores the above header and the actual data as plain text. Some of the raster files need to be complemented by look-up tables providing additional information. The tables and their structure are described in more detail in subsection \ref tables. - Take care of the follwing limitations to mHM input data during data processing: + Take care of the following limitations to mHM input data during data processing: \arg All gridded input, i.e. your morphological and your meteorological data, needs to cover the same spatial domain. 
That means, that the values xllcorner, yllcorner, xllcorner+ncols*cellsize and yllcorner+nrows*cellsize have to be identical for all files! \arg MHM allows you to provide your meteorological forcing in a different horizontal resolution than the morphological data. The larger cellsize however needs to be a multiple of @@ -242,7 +242,7 @@ The required datasets and their corresponding filenames: Description | Raster file name | Table file name | | -------------------------- | -------------------------- | -------------------------- | -Sink filled Digital Elevevation Model (DEM) | dem.asc | - +Sink filled Digital Elevation Model (DEM) | dem.asc | - Slope map | slope.asc | - Aspect Map | aspect.asc | - Flow Direction map | fdir.asc | - @@ -261,7 +261,7 @@ In the following paragraphs a possible GIS workflow is outlined using the softwa \subsection gnrl General considerations -\arg As the spatial discretizations (i.e. resolutions, origin) of your datasets will most likely differ, it is recommended to set the following envirnmental settings on every processing step outlined +\arg As the spatial discretizations (i.e. resolutions, origin) of your datasets will most likely differ, it is recommended to set the following environmental settings on every processing step outlined in the following paragraphs. \image html environments.png "Button 'Environments...' in all Toolbox windows" \anchor fig_environments \image latex environments.pdf "Button 'Environments...' in all Toolbox windows" width=14cm @@ -277,7 +277,7 @@ In the following paragraphs a possible GIS workflow is outlined using the softwa \anchor fig_project \image latex project.pdf "System Toolboxes -> Data Management Tools -> Projections and Transformations -> Raster -> Project Raster" width=14cm \arg If your input datasets are not already in the desired level-0 resolution, resample the DEM, the hydrogeological, LAI, soil and land use maps. 
- Choosing an appropiate resolution depends on data quality and needed level of simulation detail, but keep in mind that: + Choosing an appropriate resolution depends on data quality and needed level of simulation detail, but keep in mind that: 1. Your different input resolution levels must be multiples of each other. E.g. you should choose a level-0 resolution of 100m (instead of 90m in case you are using SRTM data) if your meteorological input resolution is 4km. @@ -288,12 +288,12 @@ In the following paragraphs a possible GIS workflow is outlined using the softwa \anchor fig_resample \image latex resample.pdf "System Toolboxes -> Data Management Tools -> Raster -> Raster Processing -> Resample" width=14cm \arg It is important that all your morphological input files exactly cover the same spatial domain. That also means that if a cell contains valid data in any one of the datasets, the very same cell must - also be definied in all the others. One possibility to solve this typical problem would be to set such 'doubtful' cells to the corresponding NODATA_value. Therefore create a mask as + also be defined in all the others. One possibility to solve this typical problem would be to set such 'doubtful' cells to the corresponding NODATA_value. Therefore create a mask as depicted below, which only contains cells, that are defined everywhere. \image html data_mask.png "System Toolboxes -> Spatial Analyst Tools -> Map Algebra -> Raster Calculator" \anchor fig_data_mask \image latex data_mask.pdf "System Toolboxes -> Spatial Analyst Tools -> Map Algebra -> Raster Calculator" width=14cm -\arg Mask all the mentioned datasets with the output of the 'Raster Calculator' follwing the procedure described in \ref mask of this tutorial. In case the described processing step is necessary, +\arg Mask all the mentioned datasets with the output of the 'Raster Calculator' following the procedure described in \ref mask of this tutorial. 
In case the described processing step is necessary, accomplish it before you reach subsection \ref hydro ! Masking these maps would most likely disturb the hydrological properties of your catchment data and result in unexpected model behaviour. @@ -314,13 +314,13 @@ In the following paragraphs a possible GIS workflow is outlined using the softwa \subsection hydro Flow direction and flow accumulation - Depending on quality and resolution of the DEM map, these steps can be done with the respective tools from the Spatial Anaylst Extension or by using the Arc Hydro Tools. + Depending on quality and resolution of the DEM map, these steps can be done with the respective tools from the Spatial Analyst Extension or by using the Arc Hydro Tools. \subsubsection spatial_analyst Spatial Analyst If a high quality DEM, with a resolution fine enough to represent small scale river morphology is available, you may calculate flow direction and flow accumulation directly. -\arg Flow Directon +\arg Flow Direction \image html fdir.png "System Toolboxes -> Spatial Analyst Tools -> Hydrology -> Flow Direction" \anchor fig_fdir \image latex fdir.pdf "System Toolboxes -> Spatial Analyst Tools -> Hydrology -> Flow Direction" width=14cm @@ -352,8 +352,8 @@ In case the necessary stream network file is not available, you can get one from \subsection gauges Gauges map - \arg Assuming that you have the positions of your gauges in a table, which has at least the colums x, y, id (in any order and/or amongst other columns), you are able to - to convert your data into a Point Shapefile. Choose the appropiate fields and do not forget to set the coordinate system information. + \arg Assuming that you have the positions of your gauges in a table, which has at least the columns x, y, id (in any order and/or amongst other columns), you are able to + to convert your data into a Point Shapefile. Choose the appropriate fields and do not forget to set the coordinate system information. 
\image html xy_gauges.png "Right click on the table in Arc Catalog - > Create Feature Class -> From XY Table" \anchor fig_xy_gauges \image latex xy_gauges.pdf "Right click on the table in Arc Catalog - > Create Feature Class -> From XY Table" width=8cm @@ -414,7 +414,7 @@ All these look-up tables specify the total number of classes/units to read in th following the keywords 'nSoil_Types', 'nGeo_Formations' and 'NoLAIclasses' respectively. The second line acts as a header describing the contents of the following data. In the subsection \ref soil_table \ref hydrogeo_table and \ref lai_table screenshots of shorted, but sufficient table -files are presented. All fields are further listed and described in the repective tables. +files are presented. All fields are further listed and described in the respective tables. \subsection soil_table The soil look-up table @@ -482,7 +482,7 @@ Line | Description For every gauge id given in 'idgauges.asc' one gauge file has to be created as outlined above. The file name has to exactly reflect the gauge id and carry a '.txt' extension. The table data file for a gauge with an id of 0343 in 'idgauges.asc' should -therfore be named '0343.txt'. +therefore be named '0343.txt'. \section post-gis Post-GIS preparation @@ -520,7 +520,7 @@ The script takes three Input arguments: 2. The target grid which could also be an header file as described in subsection \ref headerfiles 3. An output file -The program will fail, if your target grid is smaller than the source grid or if the cellsizes of both grids are not divisable. +The program will fail if your target grid is smaller than the source grid or if the cellsizes of both grids are not divisible. If necessary the source grid will be shifted in order to make both origins match. This 'shift' is only accomplished by changing the values of the corner coordinates, no real interpolation will be done.
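(Note on the data-preparation chapter touched above: it requires that all gridded inputs cover exactly the same spatial domain, i.e. identical xllcorner, yllcorner, xllcorner+ncols*cellsize and yllcorner+nrows*cellsize. This is easy to verify before running mHM. A minimal Python sketch, assuming standard six-line ArcGIS ASCII headers; the function names are illustrative, not part of mHM:)

```python
# Sketch: verify that several ArcGIS ASCII grids cover the same spatial
# domain, as mHM requires. Grids may differ in resolution (e.g. coarser
# meteorological forcing) as long as the overall extent is identical.
def read_header(path):
    """Read the six-line ArcGIS ASCII header into a dict of floats."""
    header = {}
    with open(path) as f:
        for _ in range(6):
            key, value = f.readline().split()
            header[key.lower()] = float(value)
    return header

def extent(header):
    """Return (xmin, ymin, xmax, ymax) of a grid."""
    xmin, ymin = header["xllcorner"], header["yllcorner"]
    return (xmin, ymin,
            xmin + header["ncols"] * header["cellsize"],
            ymin + header["nrows"] * header["cellsize"])

def check_domains(paths):
    """Raise ValueError if any grid's extent differs from the first one."""
    reference = extent(read_header(paths[0]))
    for path in paths[1:]:
        if extent(read_header(path)) != reference:
            raise ValueError(f"{path}: extent differs from {reference}")
```

(Running it over dem.asc, slope.asc, aspect.asc and the meteorological header files before the GIS post-processing steps catches mismatched masks early.)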
diff --git a/doc/4-visualise_out.dox b/doc/4-visualise_out.dox index 1280fc0d..a6bae9a8 100644 --- a/doc/4-visualise_out.dox +++ b/doc/4-visualise_out.dox @@ -9,7 +9,7 @@ mHM usually writes two files into the output directory specified in mhm.nml: \code ConfigFile.log daily_discharge.out - dischage.nc + discharge.nc mHM_Fluxes_States.nc mRM_Fluxes_States.nc \endcode @@ -35,7 +35,7 @@ In order to hide unnecessary output messages, you may pipe them to a log file: ncview FILE.nc 1> ~/log/ncview.out 2> ~/log/ncview.err \endcode If you want to transfer large nc files through servers or visualise them locally, it might be useful to compress the data before. -The featured nc file deflation with the ncks module will decrease the file size usually by 30-60%. Choose the deflation level from minimum (0) to maximum (9). The -4 switch in the following command will convert netcdf3 files to version 4 simultanously. +The featured nc file deflation with the ncks module will decrease the file size usually by 30-60%. Choose the deflation level from minimum (0) to maximum (9). The -4 switch in the following command will convert netcdf3 files to version 4 simultaneously. \code ncks -4 -L 9 IN.nc OUT.nc \endcode @@ -57,7 +57,7 @@ In some cases it has proven to be necessary to add the following line to /etc/pr \section restartview Exploring the contents of the Restarting files -Restarting files are simple binary dumps of arrays and vectors of all constants, parameters, state variables (1D, 2D) and fluxes at a given point in time of a simulation that are needed for executing the subsequent mHM time step in a new instance of this model without performing spin-out simulations and additional run time up to the previos time point. 
The stored information is divided into three categories: 1) Configuration variables at L0 and L1 levels (xxx_config.nc), 2) configuration variables at L11 level (i.e., routing) (xxx_L11_config.nc), and 3) and effective parameters, state variables and fluxes at L1 and L11 levels (xxx_states.nc). +Restarting files are simple binary dumps of arrays and vectors of all constants, parameters, state variables (1D, 2D) and fluxes at a given point in time of a simulation that are needed for executing the subsequent mHM time step in a new instance of this model without performing spin-up simulations and additional run time up to the previous time point. The stored information is divided into three categories: 1) Configuration variables at L0 and L1 levels (xxx_config.nc), 2) configuration variables at L11 level (i.e., routing) (xxx_L11_config.nc), and 3) effective parameters, state variables and fluxes at L1 and L11 levels (xxx_states.nc). \section warnview WARNINGS diff --git a/doc/5-calibration.dox b/doc/5-calibration.dox index 89379faf..491de5dd 100644 --- a/doc/5-calibration.dox +++ b/doc/5-calibration.dox @@ -19,7 +19,7 @@ not to change the bounds, since they are the result of intensive sensitivity stu Different optimization methods are available to find the best configuration of parameters for the selected objective function. You may chose between Simulated Annealing (SA), Dynamically Dimensioned Search (DDS) and Shuffled Complex -Evolution algorith (SCE). Details about these methods can be found in the module describtion part of this manual. At the +Evolution algorithm (SCE). Details about these methods can be found in the module description part of this manual. At the very end of mhm.nml additional settings are offered for optimization. \subsection Objective Functions @@ -47,7 +47,7 @@ modified with tools like cdo or python. \subsection Output Calibration runs result in a parameter file called FinalParams.nml which contains the optimized parameter -set.
Replace mhm_parameters.nml with FinalParams.nml, then run mHM in foreward mode in order +set. Replace mhm_parameters.nml with FinalParams.nml, then run mHM in forward mode in order to obtain hydrologic predictions. */ \ No newline at end of file diff --git a/doc/INSTALL.md b/doc/INSTALL.md index 8963e7b9..4478dbf3 100644 --- a/doc/INSTALL.md +++ b/doc/INSTALL.md @@ -62,7 +62,7 @@ Easiest way to do so is: 3. Open Ubuntu from the new entry in the start menu -Then you can follow the install instructions for Ubuntu from above. +Then you can follow the installation instructions for Ubuntu from above. If you rather want to use [Cygwin](https://www.cygwin.com/) (tool providing Linux functionality on Windows), step-by-step guidelines on @@ -195,7 +195,7 @@ cmake --install release --prefix $CONDA_PREFIX Starting with version 5.12, mHM is depending on [FORCES](https://git.ufz.de/chs/forces/), our Fortran library for Computational Environmental Systems. This library is downloaded on the fly by [CPM](https://github.com/cpm-cmake/CPM.cmake), the cmake package manager. -If you don't want to download it indirectly, know you wont have internet during your development or you want to work on routines provided by FORCES, you can place a copy of the FORCES repository in the root of your cloned mHM repository by e.g.: +If you don't want to download it indirectly, know you won't have internet during your development or you want to work on routines provided by FORCES, you can place a copy of the FORCES repository in the root of your cloned mHM repository by e.g.: ```bash git clone https://git.ufz.de/chs/forces.git ```