Merge pull request ESCOMP#2770 from samsrabin/docs-fixes-202409
Various documentation fixes/improvements
samsrabin authored Sep 26, 2024
2 parents 00b9f3a + edefa3c commit da386fe
Showing 39 changed files with 355 additions and 772 deletions.
98 changes: 0 additions & 98 deletions doc/source/how-to-make-mesh.md

This file was deleted.

95 changes: 34 additions & 61 deletions doc/source/lilac/specific-atm-models/wrf-nesting.rst
Using CTSM with WRF (Nested Model Runs)
========================================

This section includes instructions on how to run WRF coupled with CTSM for a nested domain.

A nested domain is usually used to have a finer-resolution domain within the coarser model domain. A nested simulation enables running at a higher resolution over a smaller domain.

.. note::
   A nest should cover a portion of the parent domain and is fully contained by
   the parent domain.

There are currently two types of nesting available within WRF:

This example clarifies the workflow for running a nested WRF-CTSM case using one-way nesting with ``ndown.exe``.

The procedure for running a nested simulation for WRF with CTSM is
similar to the workflow for running WRF real cases, except that it requires additional steps to (1) clone the CTSM repository, (2) build CTSM and LILAC, and (3) define namelist options required for CTSM.

A full description of all steps for a WRF-CTSM run is included here.

Therefore, we are not repeating the steps necessary for building WRF and
CTSM.

In this example we use a nested domain over the CONUS as shown below:

.. _Figure ctsm-ndown:

.. todo::
   Replace missing ndown_ctsm_diagram.svg

.. figure:: ndown_ctsm_diagram.svg

   Flowchart for WRF-CTSM one-way nested simulations

Nested Simulations : Pre-processing (geogrid.exe)
-------------------------------------------------
In the WPS/ directory, edit ``namelist.wps`` for a nested simulation over your
desired domains. Make sure to change ``max_dom=2``.
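
For instance, the nesting-related entries in the ``&geogrid`` section of ``namelist.wps`` might look like this (the ratios, start indices, and grid dimensions below are illustrative; set them for your own domains)::

   &geogrid
    parent_id         = 1,   1,
    parent_grid_ratio = 1,   3,
    i_parent_start    = 1,  31,
    j_parent_start    = 1,  17,
    e_we              = 100, 121,
    e_sn              = 100, 121,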

First, use ``geogrid.exe`` to define the domain and interpolate static geographical data to the grids::

   ./geogrid.exe >& log.geogrid

This step creates two files, ``geo_em.d01.nc`` and ``geo_em.d02.nc``, which include the domain definition for each domain.

If the geogrid step finishes successfully, you should see the following message in the log file::

   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
   ! Successful completion of geogrid. !
   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

The basic difference here from a non-nested case is that ``namelist.wps`` should have a column for each domain, with ``max_dom=2``. For example::
   &share
    wrf_core = 'ARW',
    max_dom = 2,
Therefore ``geogrid.exe`` creates two files corresponding to each domain.

Nested Simulations : Pre-processing (ungrib.exe)
-------------------------------------------------
As mentioned previously, the purpose of the ungrib script is to unpack GRIB meteorological data and pack it into an intermediate file format. This step is identical to that of a non-nested simulation.

Run ``ungrib.exe`` to convert the GRIB data into an intermediate format that can be ingested by WRF.

To run ``ungrib.exe``, first link the GRIB data files that are going to be used::

   ./link_grib.csh $your_GRIB_data_path

Based on your GRIB data type, link or copy the appropriate VTable to your WPS directory. WRF provides prepared VTables in the ``ungrib/Variable_tables/`` folder.
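
For example, for GFS GRIB data the link might look like this, run from your WPS directory (``Vtable.GFS`` ships with WPS)::

   ln -sf ungrib/Variable_tables/Vtable.GFS Vtable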

Extract meteorological fields from GRIB-formatted files::
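
   ./ungrib.exe >& log.ungrib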

At this point, you should see ungrib output (intermediate files) in your WPS directory.

Nested Simulations : Pre-processing (metgrid.exe)
-------------------------------------------------
Ensure that the ``start_date`` and ``end_date`` for domain two are set correctly for your simulation. Next, run ``metgrid.exe``::

   ./metgrid.exe >& log.metgrid

Check the metgrid log for the following message, which indicates successful completion of the metgrid step::

   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
   ! Successful completion of metgrid. !
   !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

Running metgrid for two domains will create files like the following::

   met_em.d01.*
   met_em.d02.*

Nested Simulations : real.exe
------------------------------

In this step, run ``real.exe`` to generate initial and boundary conditions for both domains.

In summary, complete the following steps:

Move or link WPS output files (``met_em.d01*`` and ``met_em.d02*`` files) to your WRF test directory.

Edit ``namelist.input`` for your WRF domain and desired configuration. This should be the same domain as in the WPS namelist. Make sure you set ``max_dom = 2`` in the namelist.
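
For instance, the ``&domains`` section might contain entries like these (the values are purely illustrative for a 3:1 nest)::

   &domains
    time_step         = 72,
    max_dom           = 2,
    e_we              = 100, 121,
    e_sn              = 100, 121,
    dx                = 12000, 4000,
    dy                = 12000, 4000,
    parent_grid_ratio = 1,   3,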

To run WRF-CTSM, change the land-surface option in your namelist to 6 for both domains::

   sf_surface_physics = 6, 6,

Run ``real.exe`` (if compiled in parallel, submit a batch job) to generate initial and boundary condition files for both domains. Make sure the following three files have been created in your directory::

   wrfinput_d01
   wrfinput_d02
   wrfbdy_d01
The boundary condition file is only created for the outer domain.

Check the last line of the real log file for the following message:

.. todo:: What message?

Rename wrfinput_d02
-------------------
Next, rename the ``wrfinput_d02`` file to ``wrfndi_d02``::
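
   mv wrfinput_d02 wrfndi_d02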

Run ndown.exe
-------------
In this step, we run ``ndown.exe`` to create the initial and boundary conditions for domain 2 based on domain 1 (the outer domain).

Add the following into your namelist.input file under ``&time_control``::
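
   io_form_auxinput2 = 2

.. note::
   The entry above follows the standard WRF one-way nesting (``ndown``) procedure; double-check the WRF User's Guide for the version you are running.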

Run ndown.exe to create ``wrfinput_d02`` and ``wrfbdy_d02``.
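
For example (the log file name is just illustrative)::

   ./ndown.exe >& log.ndown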

Run WRF for coarser domain
---------------------------
In this step, run WRF for the outer domain. Make sure that ``max_dom = 1`` to run only for the coarser domain.
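
For example, if WRF was compiled in parallel, the command might look like this (the launcher and log name are illustrative)::

   mpiexec ./wrf.exe >& log.wrf_coarse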

This step is identical to the previous example and only creates the ``wrfout*`` files for the coarser domain.

Please make sure to copy ``lnd_in``, ``lilac_in``, and ``lnd_modelio`` for the coarser domain into this directory.
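
For example, assuming those files live in a sibling directory named ``ctsm_coarse`` (an illustrative name)::

   cp ../ctsm_coarse/lnd_in ../ctsm_coarse/lilac_in ../ctsm_coarse/lnd_modelio .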

Create CTSM runtime files for the fine domain
---------------------------------------------
This step is in addition to creating the CTSM runtime files for the coarser domain, which was explained above. To have successfully completed the previous step, you should already have created these files for the coarser domain.

.. seealso::

   To create the CTSM runtime files for the finer domain, you should follow the steps in section
   :numref:`setting-ctsm-runtime-options`.

Again, the goal here is to create the files that determine the CTSM runtime options, which are defined within these three files:

- ``lnd_in``: This is the main namelist input file for the CTSM inner domain


Run WRF for the finer domain
-----------------------------
First, save (rename or move) the data from the coarser domain simulation (``wrfout_d01_*`` files). Next, rename ``wrfinput_d02`` and ``wrfbdy_d02`` to ``wrfinput_d01`` and ``wrfbdy_d01``, respectively.
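
For example (``coarse_output`` is an illustrative directory name)::

   mkdir coarse_output
   mv wrfout_d01_* coarse_output/
   mv wrfinput_d02 wrfinput_d01
   mv wrfbdy_d02 wrfbdy_d01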

Edit ``namelist.input``, moving all of the fine-grid domain data from column 2 to column 1 so that this run will be for the fine-grid domain only. Make sure you set ``max_dom=1`` and set your ``time_step`` based on the finer domain.
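
After this edit, the relevant ``&domains`` entries might look like the following sketch (values are illustrative for a 4-km nest)::

   &domains
    time_step = 24,
    max_dom   = 1,
    e_we      = 121,
    e_sn      = 121,
    dx        = 4000,
    dy        = 4000,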

.. note::
   It may be beneficial to save ``namelist.input`` to something else prior to this step in case you need to repeat this step.
6 changes: 6 additions & 0 deletions doc/source/lilac/specific-atm-models/wrf-tools.rst

   ../../../configure --macros-format Makefile --mpilib mpi-serial

.. todo::
   Update the below, as domain files aren't needed with nuopc.

5. Generate CTSM domain files using the ``gen_domain`` tool::

   ./gen_domain -m /glade/work/$USER/ctsm/nldas_grid/scrip/wrf2clm_mapping_noneg.nc -o wrf2clm_ocn_noneg -l wrf2clm_lnd_noneg

.. todo::
   Update the below, as ``mksurfdata.pl`` no longer exists.

6. Create surface datasets in ``tools/mksurfdata_esmf``::

   ./mksurfdata.pl -res usrspec -usr_gname "nldas" -usr_gdate "190124" -usr_mapdir "/glade/work/$USER/ctsm/nldas_grid/map" -y 2000 -exedir "/glade/u/home/$USER/src/ctsm/ctsm_surfdata/tools/mksurfdata_esmf" -no-crop
Expand Down
Adding New Resolutions
========================

In the last chapter we gave the details on how to create new files for input into CLM. These files could be global resolutions, regional grids, or even a single grid point. If you want to easily have these files available for continued use in your development, you will want to include them in the build-namelist database so that build-namelist can easily find them for you. You can deal with them just by putting the settings in the ``user_nl_clm`` namelist file, or by using ``CLM_USRDAT_NAME``. Another way is to enter them into the database for build-namelist, so that build-namelist can find them for you. This keeps one central database for all your files, rather than having multiple locations to keep track of. If you have a LOT of files, this might also be easier than keeping track by hand, especially if you have to periodically update your files. If you just have a few quick experiments to try for a short time period, you might be best off using the other methods mentioned above.

There are two parts to adding files to the build-namelist database. The first part is adding new resolution names, which is done in the ``$CTSMROOT/bld/namelist_files/namelist_definition_ctsm.xml`` file. You can then use the new resolution by using ``CLM_USRDAT_NAME``. If you also want to be able to give the resolution to ``$CTSMROOT/cime/scripts/create_newcase``, you'll need to add the grid to the ``$CIMEROOT/config/cesm/config_grid.xml`` file.

The second part is actually adding the new filenames, which is done in the ``$CTSMROOT/bld/namelist_files/namelist_defaults_ctsm.xml`` file. If you aren't adding any new resolutions and are just changing the files for existing resolutions, you don't need to edit the namelist_definition file.
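
As a rough sketch of the two edits (element and attribute names vary by CTSM version, so treat everything below, including the ``4x5_mynewgrid`` name, as illustrative rather than definitive), a new resolution name is added to the valid values in the definition file, and a surface dataset for that resolution is added to the defaults file::

   <!-- namelist_definition: add the new name to the valid resolutions -->
   <entry id="res" valid_values="0.9x1.25,1.9x2.5,4x5_mynewgrid">

   <!-- namelist_defaults: point the new resolution at its surface dataset -->
   <fsurdat hgrid="4x5_mynewgrid" sim_year="2000">
   lnd/clm2/surfdata_map/surfdata_4x5_mynewgrid_simyr2000.nc
   </fsurdat>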
