In my stripped-down version it all runs fine. Copying and modifying Dave's ZPS configuration, things start to break down (always at ts=25). A quick diff of my namelist_cfg against Dave's shows a few candidates. A summary of a few sensitivity tests on the namelist_cfg:
- nn_tra_dta = 0 (fails ts=25; original value 1)
- ln_isf = false (fails ts=25; original value true)
- rn_rdt = 240 (fails ts=31; original value 300)
- nn_dyn3d_dta = 0 (fails ts=25; original value 1)
- ln_vol = false (fails ts=25; original value true)
- ln_bdy = false (fails ts=25; original value true)
I've also switched off the runoff and geothermal heating switches, but neither makes any difference.
Having switched out the initial conditions [which incidentally had different names, but appear to be the same data] with no success, I turned my attention to the cpp keys. By eliminating keys one by one I've found that key_trabbl appears to be the one causing the SZT configuration to crash.
Solution: you can still compile with key_trabbl; just set the namelist parameters accordingly (i.e. for SZT set the bbl diffusion and advection to 0, so the same exe can be used for the ZPS and SZT configurations):
!-----------------------------------------------------------------------
&nambbl ! bottom boundary layer scheme
!-----------------------------------------------------------------------
nn_bbl_ldf = 0 ! diffusive bbl (=1) or not (=0)
nn_bbl_adv = 0 ! advective bbl (=1/2) or not (=0)
/
Why should trabbl be an issue in the SZT configuration?
Reading through the namelists the following choices warrant further investigation/understanding:
- ln_dm2dc = .false. in DM's JRA-forced simulations. Was this the case in the CORE2 ones?
- DRM : ln_dm2dc = .false. for JRA because the forcing fields are specified every 3 hours; setting it to .true. is trapped as an error. I used ln_dm2dc = .true. with CORE2.
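For reference, a minimal sketch of the corresponding entry (assuming the standard NEMO &namsbc group, where ln_dm2dc lives):
!-----------------------------------------------------------------------
&namsbc ! surface boundary condition
!-----------------------------------------------------------------------
ln_dm2dc = .false. ! reconstruct a diurnal cycle from daily-mean solar forcing;
                   ! only valid with daily-mean input (e.g. CORE2), hence
                   ! .false. for the 3-hourly JRA fields
/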
- ln_full_vel = .true. in DM's simulations. Presumably this means the full velocities are being supplied at the northern boundaries? It is currently set to .false. in this repository, but .true. would seem a better choice unless there are sensible reasons otherwise.
- ln_full_vel was switched off as it was causing the SZT model to blow up on the boundary. The purpose of this switch is to decompose the BT and BC velocities from the full 3D BDY velocity file. As a Neumann condition was being used for the BC velocities, I've just switched ln_full_vel = .false. so that only the 2D barotropic BDY files are read in (see the sketch below).
- DRM : This sounds like it's functionally the same? Once I found a combination of BDY options that worked I stuck with it!
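A minimal sketch of that arrangement (assuming ln_full_vel sits in the standard &nambdy_dta group, as in the NEMO reference namelists):
!-----------------------------------------------------------------------
&nambdy_dta ! open boundary data
!-----------------------------------------------------------------------
ln_full_vel = .false. ! do not decompose a full 3D velocity file into its
                      ! barotropic (BT) and baroclinic (BC) parts; read only
                      ! the 2D barotropic BDY files, with the BC velocities
                      ! handled by the Neumann condition
/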
- nn_bbl_ldf and nn_bbl_adv : see above; N006 used nn_bbl_ldf = 1 and nn_bbl_adv = 0
- ln_traldf_iso switched to ln_traldf_hor, and ln_dynldf_hor switched to ln_dynldf_lev. Are these required changes for SZT? ln_dynldf_hor is not coded for SZT (a sketch of the two groups follows below).
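A sketch of the switched settings (group names assumed from the NEMO reference namelists, using the flag spellings quoted above):
!-----------------------------------------------------------------------
&namtra_ldf ! lateral diffusion on tracers
!-----------------------------------------------------------------------
ln_traldf_iso = .false. ! iso-neutral (rotated) operator off
ln_traldf_hor = .true.  ! horizontal (geopotential) operator on
/
!-----------------------------------------------------------------------
&namdyn_ldf ! lateral viscosity on momentum
!-----------------------------------------------------------------------
ln_dynldf_hor = .false. ! horizontal operator: not coded for SZT
ln_dynldf_lev = .true.  ! operate along model levels instead
/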
- rn_aht_0 = 1500.? This seems an extremely high value, given that eORCA1 simulations typically use 1000. The N006 global ORCA12 simulation used rn_aht_0 = 125.
- DRM : For this particular revision the viscosities/diffusivities are set as if the model had a grid spacing of 1 degree at the Equator. The value of 1500 gets scaled down to 125 in ldf_tra_init of ldftra.F90; for viscosity this takes place in ldf_dyn_init of ldfdyn.F90. The specified time-space variation (nn_aht_ijk_t/nn_ahm_ijk_t = 20) results in a call to ldf_c2d in ldfc1d_c2d.F90, which sets the latitudinal variation.
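As a back-of-the-envelope check (a sketch, assuming the ldf_c2d scaling is linear in the local grid spacing relative to a 1-degree reference, consistent with the numbers above):

   aht(i,j) = rn_aht_0 * max(e1(i,j), e2(i,j)) / e1(1 degree at the Equator)

so on a 1/12-degree grid this gives 1500 * (1/12) ≈ 125 m^2/s, matching the N006 value.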
- Check with DM what the benefits of the MUSCL scheme are: should this be switched for SZT?
- DRM : I looked through all the available schemes with an eye to which ones I felt would give the least spurious diapycnal mixing. It's hard to do that without an explicit test for the particular problem... I felt that MUSCL best combined the properties I was after: it's upwind-biased, so there is less chance of under-/over-shoots (which seem to be worse than just having a slightly diffusive scheme), and it has some knowledge/estimate of subgrid-scale structure. It's probably not as good as piecewise parabolic (I think it's essentially the 1st-order version of it), Prather, or Daru & Tenaud's 7th-order scheme. At the time there wasn't a lot of information on the compact 4th-order scheme. Based on some MITgcm experiments I did a few years ago, though, higher-order centred schemes don't fare as well as higher-order upwind-biased ones. I got the worst results with low-order flux-limited schemes, which probably pick out the worst aspects of both dispersive and diffusive schemes and combine them.
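For reference, selecting MUSCL would look something like this (flag names taken from the NEMO 3.6 &namtra_adv group; newer versions shorten this to ln_traadv_mus):
!-----------------------------------------------------------------------
&namtra_adv ! advection scheme for tracers
!-----------------------------------------------------------------------
ln_traadv_cen2  = .false. ! 2nd-order centred scheme
ln_traadv_tvd   = .false. ! TVD (flux-limited) scheme
ln_traadv_muscl = .true.  ! upwind-biased MUSCL scheme
/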
- Also check with DM how to handle leap years in the simulations (so that e.g. the 5-day means remain in sync), and clarify the method for the spin-up of the ice.
- DRM : To handle leap years I came up with an inelegant but workable set of PBS scripts that I could generate via MATLAB (along with the namelists with the right timesteps in), keeping the post-processing automated. It got more complicated than I liked because of keeping the 1m means in sync as well as the 5d ones. Each leap year is run in 4 segments: 01/01-29/02, 26/02-31/03, 01/03-31/03, and 01/04-31/12. The 5d means come from the 1st, 2nd and 4th segments. There are 74 5d means instead of 73, with two that overlap by 4 days: one from 25/02-29/02 and the other from 26/02-01/03. I then average these together to produce something representative of this period (even though it has some duplicate information from 01/03). The 1m means come from the 1st, 3rd and 4th segments, with the correct start and end dates. The 1d/1m means from the 2nd segment, and the 1d/5d means from the 3rd segment, are duplicates and/or out of sync and don't get post-processed (although I kept hold of them).
- DRM : For the ice spinup I started with no ice (at the beginning of the CORE2 spinup; the JRA55 run starts from the end of the CORE2 one, including the ice). I ran for a year and then restarted, but replaced the temperature and salinity fields in the restart file with the initial condition (I found that NEMO still automatically read in the initial condition, which was handy). All the other fields were kept the same as in the restart, although I think it's the velocity fields that were probably most important. This let the eddy field keep building whilst preventing some excess salinity build-up under the ice. I did this twice, until the end of the 3rd model year, and then switched executables to just run normally from the restart files.