diff --git a/data_override/README.MD b/data_override/README.MD index f9e19464a..b35879edf 100644 --- a/data_override/README.MD +++ b/data_override/README.MD @@ -7,29 +7,29 @@ - [How to use it?](README.MD#2-how-to-use-it) - [Converting legacy data_table to data_table.yaml](README.MD#3-converting-legacy-data_table-to-data_tableyaml) - [Examples](README.MD#4-examples) +- [External Weight File Structure](README.MD#5-external-weight-file-structure) #### 1. YAML Data Table format: Each entry in the data_table has the following key values: -- **gridname:** Name of the grid to interpolate the data to. The acceptable values are "ICE", "OCN", "ATM", and "LND" -- **fieldname_code:** Name of the field as it is in the code to interpolate. -- **fieldname_file:** Name of the field as it is writen in the file. **Required** only if overriding from a file -- **file_name:** Name of the file where the variable is located, including the directory. **Required** only if overriding from a file -- **interpol_method:** Method used to interpolate the field. The acceptable values are "bilinear", "bicubic", and "none". "none" implies that the field in the file is already in the model grid. The LIMA format is no longer supported. **Required** only if overriding from a file +- **grid_name:** Name of the grid to interpolate the data to. The acceptable values are "ICE", "OCN", "ATM", and "LND" +- **fieldname_in_model:** Name of the field as it is in the code to interpolate. +- **override_file:** Optional subsection with key/value pairs defining how to override from a netcdf file. + - **file_name:** Name of the file where the variable is located, including the directory + - **fieldname_in_file:** Name of the field as it is written in the file + - **interp_method:** Method used to interpolate the field. The acceptable values are "bilinear", "bicubic", and "none". "none" implies that the field in the file is already in the model grid.
The LIMA format is no longer supported + - **multi_file:** Optional subsection with key/value pairs to use multiple (3) input netcdf files instead of 1. Note that **file_name** must be the second file in the set when using multiple input netcdf files + - **prev_file_name:** The name of the first file in the set + - **next_file_name:** The name of the third file in the set + - **external_weights:** Optional subsection with key/value pairs defining the external weights file to be used for the interpolation. + - **file_name:** Name of the file where the external weights are located, including the directory + - **source:** Name of the source that generated the external weights. The only acceptable value is "fregrid" - **factor:** Factor that will be multiplied after the data is interpolated - -If it is desired to interpolate the data to a region of the model grid. The following **optional** arguments are available. -- **region_type:** The region type. The acceptable values are "inside_region" and "outside_region" -- **lon_start:** The starting latitude in the same units as the grid data in the file -- **lon_end:** The ending latitude in the same units as the grid data in the file -- **lat_start:** The starting longitude in the same units as the grid data in the file -- **lon_end:** The ending longitude in the same units as the grid data in the file - -If it is desired to use multiple(3) input netcdf files instead of 1. The following **optional** keys are available. -- **is_multi_file:** Set to `True` is using the multi-file feature -- **prev_file_name:** The name of the first file in the set -- **next_file_name:** The name of the third file in the set - -Note that **file_name** must be the second file in the set. **prev_file_name** and/or **next_file_name** are required if **is_multi_file** is set to `True` +- **subregion:** Optional subsection with key/value pairs that define a subregion of the model grid to interpolate the data to. + - **type:** The region type.
The acceptable values are "inside_region" and "outside_region" + - **lon_start:** The starting longitude in the same units as the grid data in the file + - **lon_end:** The ending longitude in the same units as the grid data in the file + - **lat_start:** The starting latitude in the same units as the grid data in the file + - **lat_end:** The ending latitude in the same units as the grid data in the file #### 2. How to use it? In order to use the yaml data format, [libyaml](https://github.com/yaml/libyaml) needs to be installed and linked with FMS. Additionally, FMS must be compiled with -Duse_yaml macro. If using autotools, you can add `--with-yaml`, which will add the macro for you and check that libyaml is linked correctly. @@ -55,21 +55,22 @@ In the **legacy format**, the data_table will look like: In the **yaml format**, the data_table will look like ``` data_table: - - gridname : ICE - fieldname_code : sic_obs - fieldname_file : sic - file_name : INPUT/hadisst_ice.data.nc - interpol_method : bilinear - factor : 0.01 + - grid_name : ICE + fieldname_in_model : sic_obs + override_file: + - file_name : INPUT/hadisst_ice.data.nc + fieldname_in_file : sic + interp_method : bilinear + factor : 0.01 ``` Which corresponds to the following model code: ```F90 call data_override('ICE', 'sic_obs', icec, Spec_Time) ``` where: -- `ICE` corresponds to the gridname in the data_table -- `sic_obs` corresponds to the fieldname_code in the data_table -- `icec` is the variable to write the data to +- `ICE` is the component domain for which the variable is being interpolated and corresponds to the grid_name in the data_table +- `sic_obs` corresponds to the fieldname_in_model in the data_table +- `icec` is the storage array that holds the interpolated data - `Spec_Time` is the time to interpolate the data to. Additionally, it is required to call data_override_init (in this case with the ICE domain).
The grid_spec.nc file must also contain the coordinate information for the domain being used. @@ -82,15 +83,15 @@ call data_override_init(Ice_domain_in=Ice_domain) In the **legacy format**, the data_table will look like: ``` -"ICE", "sit_obs", "", "INPUT/hadisst_ice.data.nc", "none", 2.0 +"ICE", "sit_obs", "", "INPUT/hadisst_ice.data.nc", "none", 2.0 ``` In the **yaml format**, the data_table will look like: -``` +``` yaml data_table: - - gridname : ICE - fieldname_code : sit_obs - factor : 0.01 + - grid_name : ICE + fieldname_in_model : sit_obs + factor : 2.0 ``` Which corresponds to the following model code: @@ -98,9 +99,9 @@ Which corresponds to the following model code: call data_override('ICE', 'sit_obs', icec, Spec_Time) ``` where: -- `ICE` corresponds to the gridname in the data_table -- `sit_obs` corresponds to the fieldname_code in the data_table -- `icec` is the variable to write the data to +- `ICE` is the component domain for which the variable is being interpolated and corresponds to the grid_name in the data_table +- `sit_obs` corresponds to the fieldname_in_model in the data_table +- `icec` is the storage array that holds the interpolated data - `Spec_Time` is the time to interpolate the data to. Additionally, it is required to call data_override_init (in this case with the ICE domain). The grid_spec.nc file is still required to initialize data_override with the ICE domain.
@@ -117,14 +118,15 @@ In the **legacy format**, the data_table will look like: ``` In the **yaml format**, the data_table will look like: -``` +``` yaml data_table: - - gridname : OCN - fieldname_code : runoff - fieldname_file : runoff - file_name : INPUT/runoff.daitren.clim.nc - interpol_method : none - factor : 1.0 + - grid_name : OCN + fieldname_in_model : runoff + override_file: + - file_name : INPUT/runoff.daitren.clim.nc + fieldname_in_file : runoff + interp_method : none + factor : 1.0 ``` Which corresponds to the following model code: @@ -132,9 +134,9 @@ Which corresponds to the following model code: call data_override('OCN', 'runoff', runoff_data, Spec_Time) ``` where: -- `OCN` corresponds to the gridname in the data_table -- `runoff` corresponds to the fieldname_code in the data_table -- `runoff_data` is the variable to write the data to +- `OCN` is the component domain for which the variable is being interpolated and corresponds to the grid_name in the data_table +- `runoff` corresponds to the fieldname_in_model in the data_table +- `runoff_data` is the storage array that holds the interpolated data - `Spec_Time` is the time to interpolate the data to. Additionally, it is required to call data_override_init (in this case with the ocean domain). The grid_spec.nc file is still required to initialize data_override with the ocean domain and to determine if the data in the file is in the same grid as the ocean. 
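None of the converted examples exercises the optional **subregion** block described in section 1. As a hedged illustration only (the field, file, and region limits below are invented, and the limits assume the grid data in the file is in degrees), such an entry could look like:

``` yaml
data_table:
 - grid_name : OCN
   fieldname_in_model : runoff
   override_file:
   - file_name : INPUT/runoff.daitren.clim.nc
     fieldname_in_file : runoff
     interp_method : bilinear
   subregion:
   - type : inside_region
     lon_start : 0.0
     lon_end : 30.0
     lat_start : -10.0
     lat_end : 10.0
   factor : 1.0
```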
@@ -142,3 +144,59 @@ Additionally, it is required to call data_override_init (in this case with the o ```F90 call data_override_init(Ocn_domain_in=Ocn_domain) ``` + +**4.4** The following example uses the multi-file capability +``` yaml +data_table: + - grid_name : ICE + fieldname_in_model : sic_obs + override_file: + - file_name : INPUT/hadisst_ice.data_yr1.nc + fieldname_in_file : sic + interp_method : bilinear + multi_file: + - next_file_name: INPUT/hadisst_ice.data_yr2.nc + prev_file_name: INPUT/hadisst_ice.data_yr0.nc + factor : 0.01 +``` +Data override determines which file to use depending on the model time. This avoids having to combine the 3 yearly files into one, since data from the end of the previous file and the beginning of the next file are needed for time interpolation at the year boundaries. + +**4.5** The following example uses the external weight file capability +``` yaml +data_table: + - grid_name : ICE + fieldname_in_model : sic_obs + override_file: + - file_name : INPUT/hadisst_ice.data.nc + fieldname_in_file : sic + interp_method : bilinear + external_weights: + - file_name: INPUT/remap_file.nc + source: fregrid + factor : 0.01 +``` + +#### 5. External Weight File Structure + +**5.1** Bilinear weight file example from fregrid + +``` +dimensions: + nlon = 5 ; + nlat = 6 ; + three = 3 ; + four = 4 ; +variables: + int index(three, nlat, nlon) ; + double weight(four, nlat, nlon) ; +``` +- `nlon` and `nlat` must be equal to the size of the global domain.
+- `index(1,:,:)` corresponds to the index (i) of the longitude point in the data file, closest to each model lon, lat +- `index(2,:,:)` corresponds to the index (j) of the latitude point in the data file, closest to each model lon, lat +- `index(3,:,:)` corresponds to the tile (it should be 1 since data_override does not support interpolation **from** cubed-sphere grids) + - From there the four corners are (i,j), (i,j+1), (i+1,j), (i+1,j+1) +- The weights for the four corners + - weight(:,:,1) -> (i,j) + - weight(:,:,2) -> (i,j+1) + - weight(:,:,3) -> (i+1,j) + - weight(:,:,4) -> (i+1,j+1) diff --git a/data_override/include/data_override.inc b/data_override/include/data_override.inc index 84c22e952..6de2f41b6 100644 --- a/data_override/include/data_override.inc +++ b/data_override/include/data_override.inc @@ -27,7 +27,7 @@ use constants_mod, only: DEG_TO_RAD use mpp_mod, only : mpp_error, FATAL, WARNING, NOTE, stdout, stdlog, mpp_max use mpp_mod, only : input_nml_file use horiz_interp_mod, only : horiz_interp_init, horiz_interp_new, horiz_interp_type, & - assignment(=) + horiz_interp_read_weights use time_interp_external2_mod, only: time_interp_external_init, & time_interp_external, & time_interp_external_bridge, get_time_axis, & @@ -63,6 +63,9 @@ type data_type character(len=128) :: fieldname_file !< fieldname used in the netcdf data file character(len=512) :: file_name !< name of netCDF data file character(len=128) :: interpol_method !< interpolation method (default "bilinear") + logical :: ext_weights !< .True. if using an external weight file for the interpolation + character(len=128) :: ext_weights_file_name !< Name of the external weight file + character(len=128) :: ext_weights_source !< Source that generated the external weight file ("fregrid" is the only supported source) real(FMS_DATA_OVERRIDE_KIND_) :: factor !< For unit conversion, default=1, see OVERVIEW above real(FMS_DATA_OVERRIDE_KIND_) :: lon_start, lon_end, lat_start, lat_end integer :: region_type @@ -94,10 +97,21 @@ type override_type integer :: is_src, ie_src, js_src, je_src end type override_type +!> Private type for holding horiz_interp_type for a weight file +!!
This is needed so that if variables use the same weight file, +!! then we won't have to read the weight file again +!> @ingroup data_override_mod +type fmsExternalWeights_type + character(len=:), allocatable :: weight_filename !< Name of the weight file + type(horiz_interp_type) :: horiz_interp !< Horiz interp type read in from the weight file +end type fmsExternalWeights_type + integer, parameter :: lkind = FMS_DATA_OVERRIDE_KIND_ integer, parameter :: max_table=100, max_array=100 integer :: table_size !< actual size of data table +integer :: nweight_files !< Number of weight files that have been used +type(fmsExternalWeights_type), allocatable, target :: external_weights(:) !< External weights types logical :: module_is_initialized = .FALSE. type(domain2D) :: ocn_domain,atm_domain,lnd_domain, ice_domain @@ -217,6 +231,8 @@ end if if (file_exists("data_table")) & call mpp_error(FATAL, "You cannot have the legacy data_table if use_data_table_yaml=.true.") call read_table_yaml(data_table) + allocate(external_weights(table_size)) + nweight_files = 0 else if (file_exists("data_table.yaml"))& call mpp_error(FATAL, "You cannot have the yaml data_table if use_data_table_yaml=.false.") @@ -546,6 +562,7 @@ subroutine read_table(data_table) data_entry%lat_end = -1.0_lkind data_entry%region_type = NO_REGION endif + data_entry%ext_weights = .false. 
data_table(ntable) = data_entry enddo call mpp_error(FATAL,'too many enries in data_table') @@ -564,7 +581,8 @@ subroutine read_table_yaml(data_table) type(data_type), dimension(:), allocatable, intent(out) :: data_table !< Contents of the data_table.yaml integer, allocatable :: entry_id(:) - integer :: nentries + integer :: sub_block_id(1), sub2_block_id(1) + integer :: nentries, mentries integer :: i character(len=50) :: buffer integer :: file_id @@ -579,53 +597,90 @@ subroutine read_table_yaml(data_table) call get_block_ids(file_id, "data_table", entry_id) do i = 1, nentries - call get_value_from_key(file_id, entry_id(i), "gridname", data_table(i)%gridname) - call check_for_valid_gridname(data_table(i)%gridname) - call get_value_from_key(file_id, entry_id(i), "fieldname_code", data_table(i)%fieldname_code) - - data_table(i)%fieldname_file = "" - call get_value_from_key(file_id, entry_id(i), "fieldname_file", data_table(i)%fieldname_file, & - & is_optional=.true.) - - data_table(i)%multifile = .false. - call get_value_from_key(file_id, entry_id(i), "is_multi_file", data_table(i)%multifile, & - & is_optional=.true.) - - if (data_table(i)%multifile) then - data_table(i)%prev_file_name = "" - data_table(i)%next_file_name = "" - call get_value_from_key(file_id, entry_id(i), "prev_file_name", data_table(i)%prev_file_name, & - & is_optional=.true.) - call get_value_from_key(file_id, entry_id(i), "next_file_name", data_table(i)%next_file_name, & - & is_optional=.true.) 
+ call get_value_from_key(file_id, entry_id(i), "factor", data_table(i)%factor) + call get_value_from_key(file_id, entry_id(i), "grid_name", data_table(i)%gridname) + call check_for_valid_gridname(data_table(i)%gridname) + call get_value_from_key(file_id, entry_id(i), "fieldname_in_model", data_table(i)%fieldname_code) + + mentries = get_num_blocks(file_id, "override_file", parent_block_id=entry_id(i)) + data_table(i)%file_name = "" + data_table(i)%fieldname_file = "" + data_table(i)%interpol_method = "none" + data_table(i)%multifile = .false. + data_table(i)%ext_weights = .false. + data_table(i)%region_type = NO_REGION + data_table(i)%prev_file_name = "" + data_table(i)%next_file_name = "" + data_table(i)%ext_weights_file_name = "" + data_table(i)%ext_weights_source = "" + + ! If there is no override_file block, then not overriding from file, so move on to the next entry + if (mentries .eq. 0) cycle + + if(mentries.gt.1) call mpp_error(FATAL, "Too many override_file blocks in data table. "//& + "Check your data_table.yaml entry for field:"//trim(data_table(i)%gridname)//":"//& + trim(data_table(i)%fieldname_code)) + call get_block_ids(file_id, "override_file", sub_block_id, parent_block_id=entry_id(i)) + + call get_value_from_key(file_id, sub_block_id(1), "file_name", data_table(i)%file_name) + call get_value_from_key(file_id, sub_block_id(1), "fieldname_in_file", data_table(i)%fieldname_file) + call get_value_from_key(file_id, sub_block_id(1), "interp_method", data_table(i)%interpol_method) + call check_interpol_method(data_table(i)%interpol_method, data_table(i)%file_name, & + & data_table(i)%fieldname_file) + + mentries = get_num_blocks(file_id, "multi_file", parent_block_id=sub_block_id(1)) + if(mentries.gt.1) call mpp_error(FATAL, "Too many multi_file blocks in data table. "//& + "Check your data_table.yaml entry for field:"//trim(data_table(i)%gridname)//":"//& + trim(data_table(i)%fieldname_code)) + + if(mentries.gt.0) data_table(i)%multifile = .true.
+ + if (data_table(i)%multifile) then + call get_block_ids(file_id, "multi_file", sub2_block_id, parent_block_id=sub_block_id(1)) + call get_value_from_key(file_id, sub2_block_id(1), "prev_file_name", data_table(i)%prev_file_name) + call get_value_from_key(file_id, sub2_block_id(1), "next_file_name", data_table(i)%next_file_name) if (trim(data_table(i)%prev_file_name) .eq. "" .and. trim(data_table(i)%next_file_name) .eq. "") & call mpp_error(FATAL, "The prev_file_name and next_file_name must be present if is_multi_file. "//& "Check your data_table.yaml entry for field:"//trim(data_table(i)%gridname)//":"//& trim(data_table(i)%fieldname_code)) - endif + endif + + mentries = get_num_blocks(file_id, "external_weights", parent_block_id=sub_block_id(1)) + if(mentries.gt.1) call mpp_error(FATAL, "Too many external_weight blocks in data table. "//& + "Check your data_table.yaml entry for field:"//trim(data_table(i)%gridname)//":"//& + trim(data_table(i)%fieldname_code)) + + if(mentries.gt.0) data_table(i)%ext_weights = .true. - data_table(i)%file_name = "" - call get_value_from_key(file_id, entry_id(i), "file_name", data_table(i)%file_name, & - & is_optional=.true.) + if (data_table(i)%ext_weights) then + call get_block_ids(file_id, "external_weights", sub2_block_id, parent_block_id=sub_block_id(1)) + call get_value_from_key(file_id, sub2_block_id(1), "file_name", data_table(i)%ext_weights_file_name) + call get_value_from_key(file_id, sub2_block_id(1), "source", data_table(i)%ext_weights_source) + if (trim(data_table(i)%ext_weights_file_name) .eq. "" .and. trim(data_table(i)%ext_weights_source) .eq. 
"") & + call mpp_error(FATAL, "The file_name and source must be present when using external weights. "//& + "Check your data_table.yaml entry for field:"//trim(data_table(i)%gridname)//":"//& + trim(data_table(i)%fieldname_code)) + endif - data_table(i)%interpol_method = "none" - call get_value_from_key(file_id, entry_id(i), "interpol_method", data_table(i)%interpol_method, & - & is_optional=.true.) - call check_interpol_method(data_table(i)%interpol_method, data_table(i)%file_name, & - data_table(i)%fieldname_file) + mentries = get_num_blocks(file_id, "subregion", parent_block_id=entry_id(i)) + if(mentries.gt.1) call mpp_error(FATAL, "Too many subregion blocks in data table. "//& + "Check your data_table.yaml entry for field:"//trim(data_table(i)%gridname)//":"//& + trim(data_table(i)%fieldname_code)) - call get_value_from_key(file_id, entry_id(i), "factor", data_table(i)%factor) - buffer = "" - call get_value_from_key(file_id, entry_id(i), "region_type", buffer, is_optional=.true.) - call check_and_set_region_type(buffer, data_table(i)%region_type) + buffer = "" + if(mentries.gt.0) then + call get_block_ids(file_id, "subregion", sub_block_id, parent_block_id=entry_id(i)) + call get_value_from_key(file_id, sub_block_id(1), "type", buffer) + endif + call check_and_set_region_type(buffer, data_table(i)%region_type) if (data_table(i)%region_type .ne. NO_REGION) then - call get_value_from_key(file_id, entry_id(i), "lon_start", data_table(i)%lon_start, is_optional=.true.) - call get_value_from_key(file_id, entry_id(i), "lon_end", data_table(i)%lon_end, is_optional=.true.) - call get_value_from_key(file_id, entry_id(i), "lat_start", data_table(i)%lat_start, is_optional=.true.) - call get_value_from_key(file_id, entry_id(i), "lat_end", data_table(i)%lat_end, is_optional=.true.)
+ call get_value_from_key(file_id, sub_block_id(1), "lon_start", data_table(i)%lon_start) + call get_value_from_key(file_id, sub_block_id(1), "lon_end", data_table(i)%lon_end) + call get_value_from_key(file_id, sub_block_id(1), "lat_start", data_table(i)%lat_start) + call get_value_from_key(file_id, sub_block_id(1), "lat_end", data_table(i)%lat_end) call check_valid_lat_lon(data_table(i)%lon_start, data_table(i)%lon_end, & - data_table(i)%lat_start, data_table(i)%lat_end) + data_table(i)%lat_start, data_table(i)%lat_end) endif end do @@ -1026,6 +1081,9 @@ subroutine DATA_OVERRIDE_3D_(gridname,fieldname_code,return_data,time,override,d integer :: endingj !< Ending y index for the compute domain relative to the input buffer integer :: nhalox !< Number of halos in the x direction integer :: nhaloy !< Number of halos in the y direction + logical :: found_weight_file !< .True. if the weight file has already been read + integer :: nglat !< Number of latitudes in the global domain + integer :: nglon !< Number of longitudes in the global domain use_comp_domain = .false. if(.not.module_is_initialized) & @@ -1418,18 +1476,45 @@ subroutine DATA_OVERRIDE_3D_(gridname,fieldname_code,return_data,time,override,d call mpp_error(FATAL,'error: gridname not recognized in data_override') end select - select case (data_table(index1)%interpol_method) - case ('bilinear') + if (data_table(index1)%ext_weights) then + found_weight_file = .false. + do i = 1, nweight_files + if (external_weights(i)%weight_filename .eq. trim(data_table(index1)%ext_weights_file_name)) then + override_array(curr_position)%horz_interp(window_id) = external_weights(i)%horiz_interp + found_weight_file = .true. + exit + endif + enddo + + if (.not. 
found_weight_file) then + nweight_files = nweight_files + 1 + external_weights(nweight_files)%weight_filename = trim(data_table(index1)%ext_weights_file_name) + + call mpp_get_global_domain(domain, xsize=nglon, ysize=nglat) + call horiz_interp_read_weights(external_weights(nweight_files)%horiz_interp, & + external_weights(nweight_files)%weight_filename, & + lon_local(isw:iew,jsw:jew), lat_local(isw:iew,jsw:jew), & + override_array(curr_position)%lon_in(is_src:ie_src+1), & + override_array(curr_position)%lat_in(js_src:je_src+1), & + data_table(index1)%ext_weights_source, & + data_table(index1)%interpol_method, isw, iew, jsw, jew, nglon, nglat) + + override_array(curr_position)%horz_interp(window_id) = external_weights(nweight_files)%horiz_interp + endif + else + select case (data_table(index1)%interpol_method) + case ('bilinear') call horiz_interp_new (override_array(curr_position)%horz_interp(window_id), & override_array(curr_position)%lon_in(is_src:ie_src+1), & override_array(curr_position)%lat_in(js_src:je_src+1), & lon_local(isw:iew,jsw:jew), lat_local(isw:iew,jsw:jew), interp_method="bilinear") - case ('bicubic') + case ('bicubic') call horiz_interp_new (override_array(curr_position)%horz_interp(window_id), & override_array(curr_position)%lon_in(is_src:ie_src+1), & override_array(curr_position)%lat_in(js_src:je_src+1), & lon_local(isw:iew,jsw:jew), lat_local(isw:iew,jsw:jew), interp_method="bicubic") - end select + end select + endif override_array(curr_position)%need_compute(window_id) = .false. 
endif diff --git a/horiz_interp/horiz_interp.F90 b/horiz_interp/horiz_interp.F90 index 07df2b7a6..9a910ccf1 100644 --- a/horiz_interp/horiz_interp.F90 +++ b/horiz_interp/horiz_interp.F90 @@ -54,6 +54,7 @@ module horiz_interp_mod use horiz_interp_conserve_mod, only: horiz_interp_conserve_new, horiz_interp_conserve_del use horiz_interp_bilinear_mod, only: horiz_interp_bilinear_init, horiz_interp_bilinear use horiz_interp_bilinear_mod, only: horiz_interp_bilinear_new, horiz_interp_bilinear_del +use horiz_interp_bilinear_mod, only: horiz_interp_read_weights_bilinear use horiz_interp_bicubic_mod, only: horiz_interp_bicubic_init, horiz_interp_bicubic use horiz_interp_bicubic_mod, only: horiz_interp_bicubic_new, horiz_interp_bicubic_del use horiz_interp_spherical_mod, only: horiz_interp_spherical_init, horiz_interp_spherical @@ -66,7 +67,7 @@ module horiz_interp_mod !---- interfaces ---- public horiz_interp_type, horiz_interp, horiz_interp_new, horiz_interp_del, & - horiz_interp_init, horiz_interp_end, assignment(=) + horiz_interp_init, horiz_interp_end, assignment(=), horiz_interp_read_weights !> Allocates space and initializes a derived-type variable !! that contains pre-computed interpolation indices and weights. @@ -137,6 +138,12 @@ module horiz_interp_mod module procedure horiz_interp_new_1d_dst_r8 end interface + !> Subroutines for reading in weight files and using them to fill in the horiz_interp type instead + !! of calculating it + interface horiz_interp_read_weights + module procedure horiz_interp_read_weights_r4 + module procedure horiz_interp_read_weights_r8 + end interface horiz_interp_read_weights !> Subroutine for performing the horizontal interpolation between two grids. !!
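Before the bilinear read-weights routines below, it may help to see how the fregrid-style `index`/`weight` arrays documented in the README are consumed. The following NumPy sketch is illustrative only: the function name `apply_bilinear_weights` is invented, the array layout follows the README bullet list (not the netCDF dimension order), and indices are 1-based as in the Fortran code. It assumes wrap-around for the +1 neighbor on a global grid, mirroring the `where (... > size) ... = 1` logic in the weight-reading code.

```python
import numpy as np

def apply_bilinear_weights(data, index, weight):
    """Interpolate `data` (nj_in, ni_in) onto the model grid.

    index:  (nlat, nlon, 3) integer array of 1-based (i, j, tile) indices
            of the data point closest to each model (lon, lat)
    weight: (nlat, nlon, 4) weights for the four corners
            (i,j), (i,j+1), (i+1,j), (i+1,j+1)
    """
    i = index[:, :, 0] - 1  # convert 1-based Fortran indices to 0-based
    j = index[:, :, 1] - 1
    nj, ni = data.shape
    i1 = (i + 1) % ni       # wrap the +1 neighbor around the global grid
    j1 = (j + 1) % nj
    return (weight[:, :, 0] * data[j, i]
            + weight[:, :, 1] * data[j1, i]
            + weight[:, :, 2] * data[j, i1]
            + weight[:, :, 3] * data[j1, i1])
```

With equal corner weights of 0.25 the result is simply the average of the four surrounding data values, which is a quick sanity check on any weight file.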
diff --git a/horiz_interp/horiz_interp_bilinear.F90 b/horiz_interp/horiz_interp_bilinear.F90 index 2fe80b989..d8db732b2 100644 --- a/horiz_interp/horiz_interp_bilinear.F90 +++ b/horiz_interp/horiz_interp_bilinear.F90 @@ -35,13 +35,15 @@ module horiz_interp_bilinear_mod use horiz_interp_type_mod, only: horiz_interp_type, stats, BILINEAR use platform_mod, only: r4_kind, r8_kind use axis_utils2_mod, only: nearest_index + use fms2_io_mod, only: open_file, close_file, read_data, FmsNetcdfFile_t, get_dimension_size + use fms_string_utils_mod, only: string implicit none private public :: horiz_interp_bilinear_new, horiz_interp_bilinear, horiz_interp_bilinear_del - public :: horiz_interp_bilinear_init + public :: horiz_interp_bilinear_init, horiz_interp_read_weights_bilinear !> Creates a @ref horiz_interp_type for bilinear interpolation. !> @ingroup horiz_interp_bilinear_mod @@ -52,6 +54,14 @@ module horiz_interp_bilinear_mod module procedure horiz_interp_bilinear_new_2d_r8 end interface + !> Subroutines for reading in weight files and using them to fill in the horiz_interp type instead + !! of calculating it + !> @ingroup horiz_interp_bilinear_mod + interface horiz_interp_read_weights_bilinear + module procedure horiz_interp_read_weights_bilinear_r4 + module procedure horiz_interp_read_weights_bilinear_r8 + end interface + interface horiz_interp_bilinear module procedure horiz_interp_bilinear_r4 module procedure horiz_interp_bilinear_r8 diff --git a/horiz_interp/include/horiz_interp.inc b/horiz_interp/include/horiz_interp.inc index 036b87a26..c3fe335b1 100644 --- a/horiz_interp/include/horiz_interp.inc +++ b/horiz_interp/include/horiz_interp.inc @@ -840,4 +840,52 @@ return end function IS_LAT_LON_ + + !> Subroutine for reading a weight file and using it to fill in the horiz interp type +!! for the bilinear interpolation method.
+ subroutine HORIZ_INTERP_READ_WEIGHTS_(Interp, weight_filename, lon_out, lat_out, lon_in, lat_in, & + weight_file_source, interp_method, isw, iew, jsw, jew, nglon, nglat) + type(horiz_interp_type), intent(inout) :: Interp !< Horiz interp type to fill + character(len=*), intent(in) :: weight_filename !< Name of the weight file + real(FMS_HI_KIND_), intent(in) :: lat_out(:,:) !< Output (model) latitude + real(FMS_HI_KIND_), intent(in) :: lon_out(:,:) !< Output (model) longitude + real(FMS_HI_KIND_), intent(in) :: lat_in(:) !< Input (data) latitude + real(FMS_HI_KIND_), intent(in) :: lon_in(:) !< Input (data) longitude + character(len=*), intent(in) :: weight_file_source !< Source of the weight file + character(len=*), intent(in) :: interp_method !< The interp method to use + integer, intent(in) :: isw, iew, jsw, jew !< Starting and ending indices of the compute domain + integer, intent(in) :: nglon !< Number of longitudes in the global domain + integer, intent(in) :: nglat !< Number of latitudes in the global domain + + integer :: i, j !< For do loops + integer :: nlon_in !< Number of longitudes in the data grid + integer :: nlat_in !< Number of latitudes in the data grid + real(FMS_HI_KIND_), allocatable :: lon_src_1d(:) !< Center points of the longitude data grid + real(FMS_HI_KIND_), allocatable :: lat_src_1d(:) !< Center points of the latitude data grid + integer, parameter :: kindl = FMS_HI_KIND_ !< real kind size currently compiling + + select case (trim(interp_method)) + case ("bilinear") + !! This is to reproduce the behavior in horiz_interp_new + !! The subroutine assumes that the data grid (lon_in, lat_in) are + !! the edges and not the centers. + !!
Data_override passes in the edges, which are calculated using the axis_edges subroutine + nlon_in = size(lon_in(:))-1; nlat_in = size(lat_in(:))-1 + allocate(lon_src_1d(nlon_in), lat_src_1d(nlat_in)) + do i = 1, nlon_in + lon_src_1d(i) = (lon_in(i) + lon_in(i+1)) * 0.5_kindl + enddo + do j = 1, nlat_in + lat_src_1d(j) = (lat_in(j) + lat_in(j+1)) * 0.5_kindl + enddo + + call horiz_interp_read_weights_bilinear(Interp, weight_filename, lon_out, lat_out, & + lon_src_1d, lat_src_1d, weight_file_source, interp_method, & + isw, iew, jsw, jew, nglon, nglat) + deallocate(lon_src_1d,lat_src_1d) + case default + call mpp_error(FATAL, "Reading weight from file is not supported for the "//& + trim(interp_method)//" method. It is currently only supported for bilinear") + end select + end subroutine HORIZ_INTERP_READ_WEIGHTS_ !> @} diff --git a/horiz_interp/include/horiz_interp_bilinear.inc b/horiz_interp/include/horiz_interp_bilinear.inc index f178ebec1..f998b823f 100644 --- a/horiz_interp/include/horiz_interp_bilinear.inc +++ b/horiz_interp/include/horiz_interp_bilinear.inc @@ -1209,4 +1209,93 @@ return end subroutine + + !> Subroutine for reading a weight file and using it to fill in the horiz interp type + !! for the bilinear interpolation method.
+ subroutine HORIZ_INTERP_READ_WEIGHTS_BILINEAR_(Interp, weight_filename, lon_out, lat_out, lon_in, lat_in, & + weight_file_source, interp_method, isw, iew, jsw, jew, nglon, nglat) + type(horiz_interp_type), intent(inout) :: Interp !< Horiz interp type to fill + character(len=*), intent(in) :: weight_filename !< Name of the weight file + real(FMS_HI_KIND_), target, intent(in) :: lat_out(:,:) !< Output (model) latitude + real(FMS_HI_KIND_), target, intent(in) :: lon_out(:,:) !< Output (model) longitude + real(FMS_HI_KIND_), intent(in) :: lat_in(:) !< Input (data) latitude + real(FMS_HI_KIND_), intent(in) :: lon_in(:) !< Input (data) longitude + character(len=*), intent(in) :: weight_file_source !< Source of the weight file + character(len=*), intent(in) :: interp_method !< The interp method to use + integer, intent(in) :: isw, iew, jsw, jew !< Starting and ending indices of the compute domain + integer, intent(in) :: nglon !< Number of longitudes in the global domain + integer, intent(in) :: nglat !< Number of latitudes in the global domain + + + real(FMS_HI_KIND_), allocatable :: var(:,:,:) !< Dummy variable to read the indices and weight into + type(FmsNetcdfFile_t) :: weight_fileobj !< FMS2_io fileobj for the weight file + integer :: nlon !< Number of longitudes in the model grid as read + !! from the weight file + integer :: nlat !< Number of latitudes in the model grid as read + !! from the weight file + + if (.not. open_file(weight_fileobj, weight_filename, "read" )) & + call mpp_error(FATAL, "Error opening the weight file:"//& + &trim(weight_filename)) + + !< Check that weight file has the correct dimensions + select case (trim(weight_file_source)) + case ("fregrid") + call get_dimension_size(weight_fileobj, "nlon", nlon) + if (nlon .ne.
nglon) & + call mpp_error(FATAL, "The nlon from the weight file is not the same as in the input grid."//& + &" From weight file:"//string(nlon)//" from input grid:"//string(nglon)) + call get_dimension_size(weight_fileobj, "nlat", nlat) + if (nlat .ne. nglat) & + call mpp_error(FATAL, "The nlat from the weight file is not the same as in the input grid."//& + &" From weight file:"//string(nlat)//" from input grid:"//string(nglat)) + case default + call mpp_error(FATAL, trim(weight_file_source)//& + &" is not a supported weight file source. fregrid is the only supported weight file source." ) + end select + + Interp%nlon_src = size(lon_in(:)) ; Interp%nlat_src = size(lat_in(:)) + Interp%nlon_dst = size(lon_out,1); Interp%nlat_dst = size(lon_out,2) + + allocate ( Interp % HI_KIND_TYPE_ % wti (Interp%nlon_dst,Interp%nlat_dst,2), & + Interp % HI_KIND_TYPE_ % wtj (Interp%nlon_dst,Interp%nlat_dst,2), & + Interp % i_lon (Interp%nlon_dst,Interp%nlat_dst,2), & + Interp % j_lat (Interp%nlon_dst,Interp%nlat_dst,2)) + + + !! Three is for lon, lat, tile + !! Currently, interpolation is only supported from lat,lon input data + allocate(var(Interp%nlon_dst,Interp%nlat_dst, 3)) + call read_data(weight_fileobj, "index", var, corner=(/isw, jsw, 1/), edge_lengths=(/iew-isw+1, jew-jsw+1, 3/)) + + !! Each point has a lon (i), and lat(j) index + !! From there the four corners are (i,j), (i,j+1), (i+1,j), (i+1,j+1) + Interp % i_lon (:,:,1) = var(:,:,1) + Interp % i_lon (:,:,2) = Interp % i_lon (:,:,1) + 1 + where (Interp % i_lon (:,:,2) > size(lon_in(:))) Interp % i_lon (:,:,2) = 1 + + Interp % j_lat (:,:,1) = var(:,:,2) + Interp % j_lat (:,:,2) = Interp % j_lat (:,:,1) + 1 + where (Interp % j_lat (:,:,2) > size(lat_in(:))) Interp % j_lat (:,:,2) = 1 + + deallocate(var) + + allocate(var(Interp%nlon_dst,Interp%nlat_dst, 4)) + call read_data(weight_fileobj, "weight", var, corner=(/isw, jsw, 1/), edge_lengths=(/iew-isw+1, jew-jsw+1, 4/)) + + !!
The weights for the four corners + !! var(:,:,1) -> (i,j) + !! var(:,:,2) -> (i,j+1) + !! var(:,:,3) -> (i+1,j) + !! var(:,:,4) -> (i+1,j+1) + Interp % HI_KIND_TYPE_ % wti = var(:,:,1:2) + Interp % HI_KIND_TYPE_ % wtj = var(:,:,3:4) + deallocate(var) + + Interp% HI_KIND_TYPE_ % is_allocated = .true. + Interp% interp_method = BILINEAR + Interp% I_am_initialized = .True. + call close_file(weight_fileobj) + end subroutine HORIZ_INTERP_READ_WEIGHTS_BILINEAR_ + !> @} diff --git a/horiz_interp/include/horiz_interp_bilinear_r4.fh b/horiz_interp/include/horiz_interp_bilinear_r4.fh index 8880914e4..36c462a05 100644 --- a/horiz_interp/include/horiz_interp_bilinear_r4.fh +++ b/horiz_interp/include/horiz_interp_bilinear_r4.fh @@ -45,5 +45,8 @@ #undef INTERSECT_ #define INTERSECT_ intersect_r4 +#undef HORIZ_INTERP_READ_WEIGHTS_BILINEAR_ +#define HORIZ_INTERP_READ_WEIGHTS_BILINEAR_ horiz_interp_read_weights_bilinear_r4 + #include "horiz_interp_bilinear.inc" !> @} diff --git a/horiz_interp/include/horiz_interp_bilinear_r8.fh b/horiz_interp/include/horiz_interp_bilinear_r8.fh index 37a2e6920..05187557f 100644 --- a/horiz_interp/include/horiz_interp_bilinear_r8.fh +++ b/horiz_interp/include/horiz_interp_bilinear_r8.fh @@ -45,5 +45,8 @@ #undef INTERSECT_ #define INTERSECT_ intersect_r8 +#undef HORIZ_INTERP_READ_WEIGHTS_BILINEAR_ +#define HORIZ_INTERP_READ_WEIGHTS_BILINEAR_ horiz_interp_read_weights_bilinear_r8 + #include "horiz_interp_bilinear.inc" !> @} diff --git a/horiz_interp/include/horiz_interp_r4.fh b/horiz_interp/include/horiz_interp_r4.fh index a3211ee6e..89b3e6055 100644 --- a/horiz_interp/include/horiz_interp_r4.fh +++ b/horiz_interp/include/horiz_interp_r4.fh @@ -60,5 +60,8 @@ #undef IS_LAT_LON_ #define IS_LAT_LON_ is_lat_lon_r4 +#undef HORIZ_INTERP_READ_WEIGHTS_ +#define HORIZ_INTERP_READ_WEIGHTS_ horiz_interp_read_weights_r4 + #include "horiz_interp.inc" !> @} diff --git a/horiz_interp/include/horiz_interp_r8.fh b/horiz_interp/include/horiz_interp_r8.fh index 
713be9206..312a31403 100644 --- a/horiz_interp/include/horiz_interp_r8.fh +++ b/horiz_interp/include/horiz_interp_r8.fh @@ -60,5 +60,8 @@ #undef IS_LAT_LON_ #define IS_LAT_LON_ is_lat_lon_r8 +#undef HORIZ_INTERP_READ_WEIGHTS_ +#define HORIZ_INTERP_READ_WEIGHTS_ horiz_interp_read_weights_r8 + #include "horiz_interp.inc" !> @} diff --git a/test_fms/data_override/Makefile.am b/test_fms/data_override/Makefile.am index 69f09540f..087bd91ea 100644 --- a/test_fms/data_override/Makefile.am +++ b/test_fms/data_override/Makefile.am @@ -73,11 +73,11 @@ TESTS_ENVIRONMENT= test_input_path="@TEST_INPUT_PATH@" \ # Run the test program. TESTS = test_data_override2.sh test_data_override_init.sh test_data_override2_mono.sh test_data_override2_ongrid.sh \ - test_data_override2_scalar.sh + test_data_override2_scalar.sh test_data_override_weights.sh # Include these files with the distribution. EXTRA_DIST = test_data_override2.sh test_data_override_init.sh test_data_override2_mono.sh test_data_override2_ongrid.sh \ - test_data_override2_scalar.sh + test_data_override2_scalar.sh test_data_override_weights.sh # Clean up CLEANFILES = input.nml *.nc* *.out diag_table data_table data_table.yaml INPUT/* *.dpi *.spi *.dyn *.spl *-files/* diff --git a/test_fms/data_override/test_data_override2_mono.sh b/test_fms/data_override/test_data_override2_mono.sh index cf47a152f..be1cce410 100755 --- a/test_fms/data_override/test_data_override2_mono.sh +++ b/test_fms/data_override/test_data_override2_mono.sh @@ -59,17 +59,19 @@ _EOF cat <<_EOF > data_table.yaml data_table: -- gridname: OCN - fieldname_code: runoff_increasing - fieldname_file: runoff - file_name: ./INPUT/bilinear_increasing.nc - interpol_method: bilinear +- grid_name: OCN + fieldname_in_model: runoff_increasing + override_file: + - fieldname_in_file: runoff + file_name: ./INPUT/bilinear_increasing.nc + interp_method: bilinear factor: 1.0 -- gridname: OCN - fieldname_code: runoff_decreasing - fieldname_file: runoff - file_name: 
./INPUT/bilinear_decreasing.nc - interpol_method: bilinear +- grid_name: OCN + fieldname_in_model: runoff_decreasing + override_file: + - fieldname_in_file: runoff + file_name: ./INPUT/bilinear_decreasing.nc + interp_method: bilinear factor: 1.0 _EOF diff --git a/test_fms/data_override/test_data_override2_ongrid.sh b/test_fms/data_override/test_data_override2_ongrid.sh index 2e1d7a1b0..e9f36712c 100755 --- a/test_fms/data_override/test_data_override2_ongrid.sh +++ b/test_fms/data_override/test_data_override2_ongrid.sh @@ -52,12 +52,13 @@ use_data_table_yaml=.True. _EOF cat <<_EOF > data_table.yaml data_table: - - gridname : OCN - fieldname_code : runoff - fieldname_file : runoff - file_name : INPUT/runoff.daitren.clim.1440x1080.v20180328.nc - interpol_method : none - factor : 1.0 + - grid_name: OCN + fieldname_in_model: runoff + override_file: + - fieldname_in_file: runoff + file_name: INPUT/runoff.daitren.clim.1440x1080.v20180328.nc + interp_method: none + factor: 1.0 _EOF fi @@ -83,4 +84,4 @@ test_expect_success "data_override get_grid_v1 (${KIND})" ' done rm -rf INPUT *.nc # remove any leftover files to reduce size -test_done \ No newline at end of file +test_done diff --git a/test_fms/data_override/test_data_override2_scalar.sh b/test_fms/data_override/test_data_override2_scalar.sh index faf9aca08..ac19b2b0a 100755 --- a/test_fms/data_override/test_data_override2_scalar.sh +++ b/test_fms/data_override/test_data_override2_scalar.sh @@ -48,11 +48,12 @@ use_data_table_yaml=.True. 
_EOF cat <<_EOF > data_table.yaml data_table: - - gridname : OCN - fieldname_code : co2 - fieldname_file : co2 - file_name : INPUT/scalar.nc - interpol_method : none + - grid_name: OCN + fieldname_in_model: co2 + override_file: + - fieldname_in_file: co2 + file_name: INPUT/scalar.nc + interp_method: none factor : 1.0 _EOF fi @@ -68,4 +69,4 @@ test_expect_success "data_override scalar field (${KIND})" ' done rm -rf INPUT *.nc # remove any leftover files to reduce size -test_done \ No newline at end of file +test_done diff --git a/test_fms/data_override/test_data_override_ongrid.F90 b/test_fms/data_override/test_data_override_ongrid.F90 index 4345bb9f8..a05eb9d6c 100644 --- a/test_fms/data_override/test_data_override_ongrid.F90 +++ b/test_fms/data_override/test_data_override_ongrid.F90 @@ -39,8 +39,8 @@ program test_data_override_ongrid integer, parameter :: lkind = DO_TEST_KIND_ integer, dimension(2) :: layout = (/2,3/) !< Domain layout -integer :: nlon !< Number of points in x axis -integer :: nlat !< Number of points in y axis +integer :: nlon = 360 !< Number of points in x axis +integer :: nlat = 180 !< Number of points in y axis type(domain2d) :: Domain !< Domain with mask table integer :: is !< Starting x index integer :: ie !< Ending x index @@ -51,9 +51,10 @@ program test_data_override_ongrid integer, parameter :: ongrid = 1 integer, parameter :: bilinear = 2 integer, parameter :: scalar = 3 +integer, parameter :: weight_file = 4 integer :: test_case = ongrid -namelist / test_data_override_ongrid_nml / nhalox, nhaloy, test_case +namelist / test_data_override_ongrid_nml / nhalox, nhaloy, test_case, nlon, nlat, layout call mpp_init call fms2_io_init @@ -61,8 +62,6 @@ program test_data_override_ongrid read (input_nml_file, test_data_override_ongrid_nml, iostat=io_status) if (io_status > 0) call mpp_error(FATAL,'=>test_data_override_ongrid: Error reading input.nml') - - !< Wait for the root PE to catch up call mpp_sync @@ -70,9 +69,6 @@ program 
test_data_override_ongrid call set_calendar_type(NOLEAP) -nlon = 360 -nlat = 180 - !< Create a domain nlonXnlat with mask call mpp_domains_set_stack_size(17280000) call mpp_define_domains( (/1,nlon,1,nlat/), layout, Domain, xhalo=nhalox, yhalo=nhaloy, name='test_data_override_emc') @@ -86,6 +82,8 @@ program test_data_override_ongrid call generate_bilinear_input_file () case (scalar) call generate_scalar_input_file () +case (weight_file) + call generate_weight_input_file () end select call mpp_sync() @@ -101,6 +99,8 @@ program test_data_override_ongrid call bilinear_test() case (scalar) call scalar_test() +case (weight_file) + call weight_file_test() end select call mpp_exit @@ -443,6 +443,99 @@ subroutine bilinear_test() deallocate(runoff_decreasing, runoff_increasing) end subroutine bilinear_test +subroutine generate_weight_input_file() + call create_grid_spec_file () + call create_ocean_mosaic_file() + call create_ocean_hgrid_file() + call create_bilinear_data_file(.true.) + call create_weight_file() +end subroutine + +subroutine create_weight_file() + type(FmsNetcdfFile_t) :: fileobj + real(kind=r8_kind), allocatable :: vdata(:,:,:) + character(len=5) :: dim_names(3) + + dim_names(1) = "nlon" + dim_names(2) = "nlat" + if (open_file(fileobj, "INPUT/remap_file.nc", "overwrite")) then + call register_axis(fileobj, "nlon", nlon) + call register_axis(fileobj, "nlat", nlat) + call register_axis(fileobj, "three", 3) + call register_axis(fileobj, "four", 4) + + dim_names(3) = "three" + call register_field(fileobj, "index", "int", dim_names) + + dim_names(3) = "four" + call register_field(fileobj, "weight", "double", dim_names) + + allocate(vdata(nlon,nlat,3)) + vdata(1,:,1) = 1 + vdata(2,:,1) = 2 + vdata(3,:,1) = 3 + vdata(4,:,1) = 4 + vdata(5,:,1) = 5 + vdata(:,1:2,2) = 1 + vdata(:,3,2) = 2 + vdata(:,4,2) = 3 + vdata(:,5,2) = 4 + vdata(:,6,2) = 5 + vdata(:,:,3) = 1 + call write_data(fileobj, "index", vdata) + deallocate(vdata) + + allocate(vdata(nlon,nlat,4)) + vdata = 
0.5_r8_kind + vdata(:,1,3) = 1.0_r8_kind + vdata(:,6,3) = 1.0_r8_kind + vdata(:,1,4) = 0.0_r8_kind + vdata(:,6,4) = 0.0_r8_kind + + call write_data(fileobj, "weight", vdata) + deallocate(vdata) + + call close_file(fileobj) + endif +end subroutine create_weight_file + +subroutine weight_file_test() + type(time_type) :: Time !< Time + real(lkind), allocatable, dimension(:,:) :: runoff !< Data from normal override + real(lkind), allocatable, dimension(:,:) :: runoff_weight !< Data from weight file override + real(lkind) :: threshold !< Threshold for the difference in answers + + integer :: i, j !< For looping through the data + logical :: success !< .True. if the data_override was successful + + allocate(runoff(is:ie,js:je)) + allocate(runoff_weight(is:ie,js:je)) + + runoff = 999.0_lkind + runoff_weight = 999.0_lkind + Time = set_date(1,1,4,0,0,0) + call data_override('OCN','runoff_obs',runoff, Time, override=success) + if (.not. success) call mpp_error(FATAL, "Data override failed") + call data_override('OCN','runoff_obs_weights',runoff_weight, Time, override=success) + if (.not. success) call mpp_error(FATAL, "Data override failed") + + threshold = 1e-09 + if (lkind .eq. 4) then + threshold = 1e-03 + endif + + do i = is, ie + do j = js, je + if (abs(runoff(i,j) - runoff_weight(i,j)) .gt. threshold) then + call mpp_error(FATAL, "The data is not the same: "// & + string(i)//","//string(j)//":"// & + string(runoff(i,j))//" vs "//string(runoff_weight(i,j))) + endif + enddo + enddo + deallocate(runoff, runoff_weight) +end subroutine weight_file_test + !> @brief Generates the input for the bilinear data_override test_case subroutine generate_scalar_input_file() if (mpp_pe() .eq. 
mpp_root_pe()) then diff --git a/test_fms/data_override/test_data_override_weights.sh b/test_fms/data_override/test_data_override_weights.sh new file mode 100755 index 000000000..a3bc8902e --- /dev/null +++ b/test_fms/data_override/test_data_override_weights.sh @@ -0,0 +1,76 @@ +#!/bin/sh + +#*********************************************************************** +#* GNU Lesser General Public License +#* +#* This file is part of the GFDL Flexible Modeling System (FMS). +#* +#* FMS is free software: you can redistribute it and/or modify it under +#* the terms of the GNU Lesser General Public License as published by +#* the Free Software Foundation, either version 3 of the License, or (at +#* your option) any later version. +#* +#* FMS is distributed in the hope that it will be useful, but WITHOUT +#* ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or +#* FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License +#* for more details. +#* +#* You should have received a copy of the GNU Lesser General Public +#* License along with FMS. If not, see <http://www.gnu.org/licenses/>. +#*********************************************************************** +# +# Copyright (c) 2019-2021 Ed Hartnett, Uriel Ramirez, Seth Underwood + +# Set common test settings. +. ../test-lib.sh + +output_dir +[ ! -d "INPUT" ] && mkdir -p "INPUT" + +cat <<_EOF > data_table.yaml +data_table: +- grid_name: OCN + fieldname_in_model: runoff_obs + override_file: + - fieldname_in_file: runoff + file_name: ./INPUT/bilinear_increasing.nc + interp_method: bilinear + factor: 1.0 +- grid_name: OCN + fieldname_in_model: runoff_obs_weights + override_file: + - fieldname_in_file: runoff + file_name: ./INPUT/bilinear_increasing.nc + interp_method: bilinear + external_weights: + - file_name: ./INPUT/remap_file.nc + source: fregrid + factor: 1.0 +_EOF + +cat <<_EOF > input.nml +&data_override_nml + use_data_table_yaml = .True. 
+/ + +&test_data_override_ongrid_nml + test_case = 4 + nlon = 5 + nlat = 6 + layout = 1, 2 +/ +_EOF + +#The test only runs with yaml +if [ -z $parser_skip ]; then + for KIND in r4 r8 + do + rm -rf INPUT/. + test_expect_success "test_data_override with and without weight files -yaml (${KIND})" ' + mpirun -n 2 ../test_data_override_ongrid_${KIND} + ' + done +fi + +rm -rf INPUT +test_done