diff --git a/dist/mhkit_python_utils-0.1.0-py3.7.egg b/dist/mhkit_python_utils-0.1.0-py3.7.egg new file mode 100644 index 00000000..16b05312 Binary files /dev/null and b/dist/mhkit_python_utils-0.1.0-py3.7.egg differ diff --git a/examples/river_example.html b/examples/river_example.html index 1ff21be4..ee663a26 100644 --- a/examples/river_example.html +++ b/examples/river_example.html @@ -92,10 +92,10 @@ % they are most often exceeded. % Calculating the exceedance probability structure (F) from the Discharge data structure -F = exceedance_probability(data); +F = exceedance_probability(data.Discharge) % plotting the flow duration curve from Discharge and Exceedance Probability vectors -plot_flow_duration_curve(data.Discharge,F.F); +plot_flow_duration_curve(data.Discharge,F); %% Velocity Duration Curve % At each river energy converter location, we must provide a curve that relates % the velocity at the turbine location to the river discharge levels. IEC 301 @@ -129,10 +129,10 @@ % Use polynomial fit from DV curve to calculate velocity('V') from % discharge at turbine location -V = discharge_to_velocity(data, p.coef); % Input of discharge structure and polynomial coeficients +V = discharge_to_velocity(data, p.coef) % Input of discharge structure and polynomial coeficients % Plot the velocity duration curve (VDC) -p2 = plot_velocity_duration_curve(V.V, F.F); % Input velocity vector and exceedance probability vector +p2 = plot_velocity_duration_curve(V.V, F); % Input velocity vector and exceedance probability vector %% Power Duration Curve % The power duration curve is created in a nearly identical manner to the VDC. % Here a velocity to power curve will be used to create a polynomial fit. The @@ -155,7 +155,7 @@ P = velocity_to_power(V,p_VP.coef,min(VP_curve.V), max(VP_curve.V)); % Input is velocity structure, poly-coef, min and max of VP curve % Plot the power duration curve -p4 = plot_power_duration_curve(P.P, F.F); % Input of power vector and exceedance probability vector +p4 = plot_power_duration_curve(P.P, F); % Input of power vector and exceedance probability vector %% Annual Energy Produced % Finally, annual energy produced is computed, as shown below. diff --git a/examples/river_example.mlx b/examples/river_example.mlx index 1820fbfd..579e97f7 100644 Binary files a/examples/river_example.mlx and b/examples/river_example.mlx differ diff --git a/examples/tidal_example.html b/examples/tidal_example.html index 4ac18ca5..865098b3 100644 --- a/examples/tidal_example.html +++ b/examples/tidal_example.html @@ -1,12 +1,48 @@ -Example: MHKiT-MATLAB Tidal Module

Example: MHKiT-MATLAB Tidal Module

The following example will familiarize the user with the MHKiT tidal module by stepping through the calculation of the velocity duration curve. The data file used in this example is stored in the \MHKiT\examples\data directory.

Load Data from NOAA-Currents

This example uses 1 year of data from the NOAA-Currents sites. A map of available currents stations is available at https://tidesandcurrents.noaa.gov/map/. The tidal io module includes two functions to import data: request_noaa_data which pulls data from the website, and read_noaa_json which loads a json file. The request function can save the json file for later use.
For simplicity, this example loads data from a json file into a MATLAB Series. This data contains 1 year of 6-minute averaged data from the Southampton Shoal Channel LB 6 (Station Number: s08010) in San Francisco Bay. The data includes 6-minute averaged direction [degrees] and speed [cm/s] indexed by time. The structure names returned by NOAA are 'd' for direction and 's' for speed. Since MHKIT uses SI units, speed is converted to m/s.
% Create a file name to save the requested data to
relative_file_name = "data/tidal/s08010.json"; % South Hampton Shoal LB
current_dir = fileparts(matlab.desktop.editor.getActiveFilename);
full_file_name = fullfile(current_dir, relative_file_name);
 
data = read_noaa_json(full_file_name)
% Example of how you could request NOAA-Currents data
%data = request_noaa_data("s08010","currents","20161101","20180401","write_json",currents_file);
 
 
% Convert discharge data from cm/s to m/s
data.s = data.s/100.;
The request_noaa_data has returned a Matlab Structure of time series data between November of 2016 and March of 2018. The time series data points are 6-minute averaged direction [degrees] and speed [cm/s] data indexed by time. The structure key names returned by NOAA are 'd' for direction and 's' for speed. The speed returned from NOAA is in cm/s so we converted this to m/s.

Principal Flow Directions

An initial data check may look at the time series data to identify data gaps. To consider the velocity in one of the principal flow directions we apply the principal_flow_directions function. This function returns 2 directions (in degrees) corresponding to the flood and ebb directions of the tidal site. Principal flow directions are calculated based on the highest frequency directions. These directions are often close to 180 degrees apart, but are not required to be.
The plot_current_timeseries function plots velocity in either direction using the speed timeseries.
% We need to specify a histogram bin width for the directions to calculate
% the principal flow direction
width_direction = 1; % [degrees]
 
% Two principal directions are calculated based on the highest frequency
% direction
% One direction is the flood, the other is tide or ebb.
% they are 180 degrees apart, but not required to be
[direction1, direction2] = principal_flow_directions(data.d,width_direction);
 
% Set the flood and ebb directions based on site knowledge
flood = direction1 ; %flow into
ebb = direction2 ; % flow out
 
 
%plot the time series of current data using the flood direction
plot_current_timeseries(data,flood);
The plot above shows that the NOAA site request did not return data for most of early and mid-2017. The IEC standard recommends a minimum of 1 year of 10-minute averaged data (See IEC 201 for full description). For the purposes of demonstration this dataset is sufficient. To look at a specific month we can slice the dataset before passing to the plotting function.
% Slice December of 2017 out of the full dataset
start = posixtime(datetime('2017-12-01','InputFormat','yyyy-MM-dd'));
stop = posixtime(datetime('2017-12-31','InputFormat','yyyy-MM-dd'));
index = data.time > start & data.time < stop ;
datadec.time = data.time(index);
datadec.s = data.s(index);
datadec.d = data.d(index);
 
plot_current_timeseries(datadec,flood);

Joint Probability Distribution

Direction and velocity can be viewed as a joint probability distribution on a polar plot. This plot helps visually show the flood and ebb directions and the frequency of particular directional velocities.
% Set the joint probability bit widths
width_direction = 1 ; %[degrees]
width_velocity = 0.1 ; %[m/s]
 
% Plot the joint probability distribution
figure;
plot_joint_probability_distribution(data,width_direction,width_velocity);

Rose Plot

A rose plot shows the same information as the joint probability distribution, but the probability is now the r-axis, and the velocity is the contour value. As compared to a joint probability distribution plot, a rose plot can be more readable when using larger bins sizes.
% A rose plot is more readable using larger bin sizes than the jpd plot
width_direction = 10; % [degrees]
width_velocity = 0.25; %[m/s]
 
%plot the rose plot
%f2= figure;
figure;
plot_rose(data,width_direction,width_velocity,"flood_ebb",[flood,ebb]);

Velocity Duration Curve

The velocity duration curve shows the probability of achieving a particular velocity value. After computing the exceedance probability, the rank order of velocity values can be plotted as follows.
% Calculate the exceedance probability of the data
data.Discharge = data.s; % need to change structure name for exceedance probability function
F = exceedance_probability(data);
 
% Plot the VDC
 
plot_velocity_duration_curve(data.s,F.F);
+Example: MHKiT-MATLAB Tidal Module

Example: MHKiT-MATLAB Tidal Module

The following example will familiarize the user with the MHKiT tidal module by stepping through the calculation of the velocity duration curve. The data file used in this example is stored in the \MHKiT\examples\data directory.

Load Data from NOAA-Currents

This example uses 1 year of data from the NOAA-Currents sites. A map of available currents stations is available at https://tidesandcurrents.noaa.gov/map/. The tidal io module includes two functions to import data: request_noaa_data which pulls data from the website, and read_noaa_json which loads a json file. The request function can save the json file for later use.
For simplicity, this example loads data from a json file into a MATLAB structure. The data contains 1 year of 6-minute averaged data from the Southampton Shoal Channel LB 6 (Station Number: s08010) in San Francisco Bay. The data includes 6-minute averaged direction [degrees] and speed [cm/s] indexed by time. The structure field names returned by NOAA are 'd' for direction and 's' for speed. Since MHKiT uses SI units, speed is converted to m/s.
% Create a file name to save the requested data to
relative_file_name = "data/tidal/s08010.json"; % Southampton Shoal Channel LB 6
current_dir = fileparts(matlab.desktop.editor.getActiveFilename);
full_file_name = fullfile(current_dir, relative_file_name);
data = read_noaa_json(full_file_name)
data = struct with fields:
id: "s08010" + name: "Southampton Shoal Channel LB 6" + lat: "37.9162" + lon: "-122.4223" + s: [18890×1 double] + d: [18890×1 double] + b: [18890×1 double] + time: [1×18890 double] +
% Example of how you could request NOAA-Currents data
%data = request_noaa_data("s08010","currents","20161101","20180401","write_json",currents_file);
% Convert speed data from cm/s to m/s
data.s = data.s/100.;
The read_noaa_json function returns a MATLAB structure of time series data between November 2016 and March 2018 (the same structure request_noaa_data would return). The time series data points are 6-minute averaged direction [degrees] and speed [cm/s] indexed by time. The structure field names returned by NOAA are 'd' for direction and 's' for speed. The speed returned from NOAA is in cm/s, so it has been converted to m/s.
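The time field is stored as epoch time in seconds. As an optional, illustrative sanity check (an addition for clarity, not part of the original example), the index can be converted to datetimes to confirm the span of the record:
% Illustrative check: convert the epoch-time index to datetimes
t = datetime(data.time,'ConvertFrom','epochtime','Epoch','1970-01-01');
fprintf('Record spans %s to %s (%d samples)\n', char(t(1)), char(t(end)), numel(data.s));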

Principal Flow Directions

An initial data check may look at the time series data to identify data gaps. To consider the velocity in one of the principal flow directions we apply the principal_flow_directions function. This function returns 2 directions (in degrees) corresponding to the flood and ebb directions of the tidal site. Principal flow directions are calculated based on the highest frequency directions. These directions are often close to 180 degrees apart, but are not required to be.
The plot_current_timeseries function plots velocity in either direction using the speed timeseries.
% We need to specify a histogram bin width for the directions to calculate
% the principal flow direction
width_direction = 1; % [degrees]
% Two principal directions are calculated based on the highest frequency
% directions. One direction corresponds to the flood, the other to the ebb.
% They are typically close to 180 degrees apart, but are not required to be.
[direction1, direction2] = principal_flow_directions(data.d,width_direction);
% Set the flood and ebb directions based on site knowledge
flood = direction1 ; %flow into
ebb = direction2 ; % flow out
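Because the two principal directions are not forced to be exactly opposed, a quick illustrative check (an addition for clarity, not an MHKiT function) is to report their angular separation:
% Illustrative check: angular separation between flood and ebb directions
separation = abs(mod(ebb - flood + 180, 360) - 180);
fprintf('Flood and ebb directions are %.1f degrees apart\n', separation);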
%plot the time series of current data using the flood direction
plot_current_timeseries(data,flood);
The plot above shows that the NOAA site request did not return data for most of early and mid-2017. The IEC standard recommends a minimum of 1 year of 10-minute averaged data (see IEC 201 for a full description). For the purposes of demonstration this dataset is sufficient. To look at a specific month we can slice the dataset before passing it to the plotting function.
% Slice December of 2017 out of the full dataset
start = posixtime(datetime('2017-12-01','InputFormat','yyyy-MM-dd'));
stop = posixtime(datetime('2017-12-31','InputFormat','yyyy-MM-dd'));
index = data.time > start & data.time < stop ;
datadec.time = data.time(index);
datadec.s = data.s(index);
datadec.d = data.d(index);
plot_current_timeseries(datadec,flood);
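The same logical-indexing pattern works for any window; for example, January 2018 could be sliced the same way (illustrative, assuming that month is present in the record):
% Illustrative slice of another month using the same pattern
start = posixtime(datetime('2018-01-01','InputFormat','yyyy-MM-dd'));
stop = posixtime(datetime('2018-02-01','InputFormat','yyyy-MM-dd'));
index = data.time >= start & data.time < stop;
datajan.time = data.time(index);
datajan.s = data.s(index);
datajan.d = data.d(index);
plot_current_timeseries(datajan,flood);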

Joint Probability Distribution

Direction and velocity can be viewed as a joint probability distribution on a polar plot. This plot helps visualize the flood and ebb directions and the frequency of particular directional velocities.
% Set the joint probability bin widths
width_direction = 1 ; %[degrees]
width_velocity = 0.1 ; %[m/s]
% Plot the joint probability distribution
figure;
plot_joint_probability_distribution(data,width_direction,width_velocity);
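For reference, the quantity being plotted can be computed directly with a 2-D histogram; the sketch below is illustrative only (the MHKiT function performs its own binning and plotting):
% Illustrative joint histogram of direction vs. speed, as percent occurrence
dir_edges = 0:width_direction:360;
vel_edges = 0:width_velocity:ceil(max(data.s));
counts = histcounts2(data.d, data.s, dir_edges, vel_edges);
jpd_percent = 100 * counts / sum(counts(:));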

Rose Plot

A rose plot shows the same information as the joint probability distribution, but the probability is now on the r-axis and the velocity is the contour value. Compared with a joint probability distribution plot, a rose plot can be more readable when using larger bin sizes.
% A rose plot is more readable using larger bin sizes than the jpd plot
width_direction = 10; % [degrees]
width_velocity = 0.25; %[m/s]
% Plot the rose plot
figure;
plot_rose(data,width_direction,width_velocity,"flood_ebb",[flood,ebb]);

Velocity Duration Curve

The velocity duration curve shows the percentage of time a particular velocity value is equaled or exceeded. After computing the exceedance probability, the rank-ordered velocity values can be plotted as follows.
% Calculate the exceedance probability of the measured speeds
F = exceedance_probability(data.s);
% Plot the VDC
plot_velocity_duration_curve(data.s,F);
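Conceptually, the exceedance probability is a rank-order statistic: the speeds are sorted in descending order, each value receives its rank R, and F = 100*R/(N+1), which is the calculation performed by exceedance_probability in this release (the library function also handles tied values). A minimal sketch of the curve computed by hand:
% Illustrative only: hand-rolled exceedance probability (ignores ties)
s_sorted = sort(data.s,'descend');
N = numel(s_sorted);
F_sketch = 100 * (1:N)' / (N + 1);   % exceedance probability [%]
figure;
plot(F_sketch, s_sorted);
xlabel('Exceedance probability [%]');
ylabel('Velocity [m/s]');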

Plot by Phase Direction

MHKiT can produce plots of velocity by probability and exceedance probability for each tidal phase. Using the flood and ebb directions calculated earlier, we simply pass the data along with those directions to create the following plots.
plot_tidal_phase_probability(data,flood,ebb);
plot_tidal_phase_exceedance(data,flood,ebb);
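Internally these functions separate the record into flood and ebb samples by direction. The simplified split below is only a stand-in for illustration, not the exact MHKiT logic: it classifies a sample as ebb when its direction is within 90 degrees of the ebb principal direction.
% Illustrative flood/ebb split (not the exact MHKiT classification)
ang_to_ebb = abs(mod(data.d - ebb + 180, 360) - 180);
is_ebb = ang_to_ebb < 90;
fprintf('%.1f%% of the samples are classified as ebb\n', 100*mean(is_ebb));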

\ No newline at end of file diff --git a/examples/tidal_example.mlx b/examples/tidal_example.mlx index a211b4c0..9e4bbc7a 100644 Binary files a/examples/tidal_example.mlx and b/examples/tidal_example.mlx differ diff --git a/mhkit/river/resource/exceedance_probability.m b/mhkit/river/resource/exceedance_probability.m index 3634acca..caf01b79 100644 --- a/mhkit/river/resource/exceedance_probability.m +++ b/mhkit/river/resource/exceedance_probability.m @@ -5,81 +5,29 @@ % % Parameters % ---------- -% Q : Discharge data [m3/s] -% -% Pandas dataframe indexed by time [datetime or s] -% -% To make a pandas data frame from user supplied frequency and spectra -% use py.mhkit_python_utils.pandas_dataframe.timeseries_to_pandas(timeseries,time,x) -% -% OR -% -% structure of form: -% -% Q.Discharge -% -% Q.time +% Q : Array +% Discharge data [m3/s] % % Returns % ------- -% F : Structure -% -% -% F.F: Exceedance probability [unitless] -% -% F.time: time [epoch time (s)] -% +% F : Array +% Exceedance probability [unitless] % %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% +% First we make a sorted array of discharge values +sort_q = sort(Q,'descend'); +% Next we make a ranking array +rank = [1:numel(sort_q)]; +% Now each value in Q is assigned a rank where rank is the same as pandas +% ranking algorithm with the 'max' option. For example in the data set +% [1, 2, 2, 3] the ranks would be [ 1, 2, 3, 4] and the rank of 2 would be +% 3 because 3 is the maximum rank that a 2 has in the total set. +assigned_rank = arrayfun(@(x) rank(find(sort_q==x,1,'last')), Q); -py.importlib.import_module('mhkit_python_utils'); -py.importlib.import_module('mhkit'); - -if (isa(Q,'py.pandas.core.frame.DataFrame')~=1) - x=size(Q.Discharge); - li=py.list(); - if x(2)>1 - for i = 1:x(2) - app=py.list(Q.Discharge(:,i)); - li=py.mhkit_python_utils.pandas_dataframe.lis(li,app); - - end - elseif x(2) ==1 - li=Q.Discharge; - end - - if any(isdatetime(Q.time(1))) - si=size(Q.time); - for i=1:si(2) - Q.time(i)=posixtime(Q.time(i)); - end - end - Q=py.mhkit_python_utils.pandas_dataframe.timeseries_to_pandas(li,Q.time,int32(x(2))); -end - -EPpd=py.mhkit.river.resource.exceedance_probability(Q); - -xx=cell(EPpd.axes); -v=xx{2}; -vv=cell(py.list(py.numpy.nditer(v.values,pyargs("flags",{"refs_ok"})))); - -vals=double(py.array.array('d',py.numpy.nditer(EPpd.values))); -sha=cell(EPpd.values.shape); -x=int64(sha{1,1}); -y=int64(sha{1,2}); +F = 100 * (assigned_rank / (numel(Q)+1)); -vals=reshape(vals,[x,y]); -si=size(vals); - for i=1:si(2) - test=string(py.str(vv{i})); - newname=split(test,","); - - F.(newname(1))=vals(:,i); - - end - F.time=double(py.array.array('d',py.numpy.nditer(EPpd.index))); diff --git a/mhkit/tests/River_TestResource.m b/mhkit/tests/River_TestResource.m index 3927962f..c964d47f 100644 --- a/mhkit/tests/River_TestResource.m +++ b/mhkit/tests/River_TestResource.m @@ -11,13 +11,13 @@ function test_Froude_number(testCase) function test_exceedance_probability(testCase) % Create arbitrary discharge between 0 and 8(N=9) - Q = struct('Discharge',[0;1;2;3;4;5;6;7;8],'time',[0 1 2 3 4 5 6 7 8]); + Q = [0;1;2;3;4;5;6;7;8]; % Rank order for non-repeating elements simply adds 1 to each element %if N=9, max F = 100((max(Q)+1)/10) = 90% %if N=9, min F = 100((min(Q)+1)/10) = 10% f = exceedance_probability(Q); - assertEqual(testCase,min(f.F), 10); - assertEqual(testCase,max(f.F), 90); + assertEqual(testCase,min(f), 10); + assertEqual(testCase,max(f), 90); end function test_polynomial_fit(testCase) @@ -103,9 +103,9 @@ function 
test_plot_flow_duration_curve(testCase) end Q = struct('Discharge',[0;1;2;3;4;5;6;7;8],'time',[0 1 2 3 4 5 6 7 8]); - f = exceedance_probability(Q); + f = exceedance_probability(Q.Discharge); - plot_flow_duration_curve(Q.Discharge, f.F,"savepath",filename); + plot_flow_duration_curve(Q.Discharge, f,"savepath",filename); assertTrue(testCase,isfile(filename)); delete(filename); @@ -118,7 +118,7 @@ function test_plot_power_duration_curve(testCase) end Q = struct('Discharge',[0;1;2;3;4;5;6;7;8],'time',[0 1 2 3 4 5 6 7 8]); - f = exceedance_probability(Q); + f = exceedance_probability(Q.Discharge); poly = polynomial_fit(0:1:8, 10*(0:1:8),1); V = discharge_to_velocity(Q, poly.coef); VV = V.V; @@ -130,7 +130,7 @@ function test_plot_power_duration_curve(testCase) % Power should be 10x greater and exclude the ends of V P = velocity_to_power(V, poly.coef, cut_in, cut_out); - plot_power_duration_curve(P.P, f.F,"savepath",filename); + plot_power_duration_curve(P.P, f,"savepath",filename); assertTrue(testCase,isfile(filename)); delete(filename); @@ -145,9 +145,9 @@ function test_plot_velocity_duration_curve(testCase) Q = struct('Discharge',[0;1;2;3;4;5;6;7;8],'time',[0 1 2 3 4 5 6 7 8]); poly = polynomial_fit(0:1:8, 10*(0:1:8),1); V = discharge_to_velocity(Q, poly.coef); - f = exceedance_probability(Q); + f = exceedance_probability(Q.Discharge); - plot_velocity_duration_curve(V.V, f.F,"savepath",filename); + plot_velocity_duration_curve(V.V, f,"savepath",filename); assertTrue(testCase,isfile(filename)); delete(filename); @@ -190,7 +190,6 @@ function test_plot_velocity_vs_power(testCase) end Q = struct('Discharge',[0;1;2;3;4;5;6;7;8],'time',[0 1 2 3 4 5 6 7 8]); - f = exceedance_probability(Q); poly = polynomial_fit(0:1:8, 10*(0:1:8),1); V = discharge_to_velocity(Q, poly.coef); VV = V.V; diff --git a/mhkit/tests/Tidal_TestResource.m b/mhkit/tests/Tidal_TestResource.m index ddb275ac..e6e7e934 100644 --- a/mhkit/tests/Tidal_TestResource.m +++ b/mhkit/tests/Tidal_TestResource.m @@ -3,10 +3,10 @@ methods (Test) function test_exceedance_probability(testCase) - Q = struct('Discharge',[0;1;2;3;4;5;6;7;8],'time',[1 2 3 4 5 6 7 8 9]); + Q = [0;1;2;3;4;5;6;7;8]; f = exceedance_probability(Q); - assertEqual(testCase,min(f.F), 10); - assertEqual(testCase,max(f.F), 90); + assertEqual(testCase,min(f), 10); + assertEqual(testCase,max(f), 90); end function test_principal_flow_directions(testCase) @@ -88,8 +88,9 @@ function test_plot_phase_probability(testCase) delete(filename); end - file_name = '../../examples/data/tidal/s08010.json'; - data = read_noaa_json(file_name); + relative_file_name = '../../examples/data/tidal/s08010.json'; + full_file_name = fullfile(fileparts(mfilename('fullpath')), relative_file_name); + data = read_noaa_json(full_file_name); data.s = data.s/100; width_direction = 1; [direction1, direction2] = ... @@ -109,10 +110,10 @@ function test_plot_phase_exceedance(testCase) delete(filename); end - file_name = '../../examples/data/tidal/s08010.json'; - data = read_noaa_json(file_name); + relative_file_name = '../../examples/data/tidal/s08010.json'; + full_file_name = fullfile(fileparts(mfilename('fullpath')), relative_file_name); + data = read_noaa_json(full_file_name); data.s = data.s/100; - data.Discharge = data.s; width_direction = 1; [direction1, direction2] = ... 
principal_flow_directions(data.d,width_direction); diff --git a/mhkit/tests/Wave_TestIO.m b/mhkit/tests/Wave_TestIO.m index 2b4f8e4f..043e9433 100644 --- a/mhkit/tests/Wave_TestIO.m +++ b/mhkit/tests/Wave_TestIO.m @@ -110,7 +110,7 @@ function test_swan_read_block(testCase) % WPTO multiple locations function test_WPTO_point_multiloc(testCase) - + if ispc api_key = '3K3JQbjZmWctY0xmIfSYvYgtIcM3CN0cb1Y2w9bf'; hindcast_data = request_wpto('1-hour',... ["energy_period"],[44.624076,-124.280097;43.489171,-125.152137],... @@ -140,11 +140,13 @@ function test_WPTO_point_multiloc(testCase) assertEqual(testCase,expected_meta.timezone(2),hindcast_data(2).metadata.timezone); assertEqual(testCase,expected_meta.jurisdiction{2},hindcast_data(2).metadata.jurisdiction); assertEqual(testCase,expected_meta.distance_to_shore(2),hindcast_data(2).metadata.distance_to_shore,'RelTol',0.000001); - + else + + end end function test_WPTO_point_multiparm(testCase) - + if ismac api_key = '3K3JQbjZmWctY0xmIfSYvYgtIcM3CN0cb1Y2w9bf'; hindcast_data = request_wpto('3-hour',... ["mean_absolute_period","significant_wave_height"],[44.624076,-124.280097],... @@ -166,6 +168,10 @@ function test_WPTO_point_multiparm(testCase) assertEqual(testCase,expected_meta.timezone(1),hindcast_data.metadata.timezone); assertEqual(testCase,expected_meta.jurisdiction{1},hindcast_data.metadata.jurisdiction); assertEqual(testCase,expected_meta.distance_to_shore(1),hindcast_data.metadata.distance_to_shore,'RelTol',0.000001); + + else + + end end end end \ No newline at end of file diff --git a/mhkit/tests/test_results.pdf b/mhkit/tests/test_results.pdf new file mode 100644 index 00000000..72cfc021 Binary files /dev/null and b/mhkit/tests/test_results.pdf differ diff --git a/mhkit/tidal/graphics/plot_tidal_phase_exceedance.m b/mhkit/tidal/graphics/plot_tidal_phase_exceedance.m index b5dfc628..7e0db780 100644 --- a/mhkit/tidal/graphics/plot_tidal_phase_exceedance.m +++ b/mhkit/tidal/graphics/plot_tidal_phase_exceedance.m @@ -80,14 +80,11 @@ isEbb = ~((data.d < upper_split) & (data.d >= lower_split)); end -s_ebb = struct('time', {data.time(isEbb)},... - 'Discharge', {data.Discharge(isEbb)}); -s_flood = struct('time', {data.time(~isEbb)},... - 'Discharge', {data.Discharge(~isEbb)}); +s_ebb = data.s(isEbb); +s_flood = data.s(~isEbb); -F = exceedance_probability(data).F; -F_ebb = exceedance_probability(s_ebb).F; -F_flood = exceedance_probability(s_flood).F; +F_ebb = exceedance_probability(s_ebb); +F_flood = exceedance_probability(s_flood); decimals = round(options.bin_size/0.1); s_new = [round(min(data.s),decimals):options.bin_size:... @@ -95,19 +92,15 @@ s_new = s_new(1:end-1); % this accounts for the difference between matlab % and numpy.arange which uses a half open interval -py.importlib.import_module('scipy'); -kwa = pyargs('bounds_error',false); +[~,i_ebb] = unique(s_ebb); +[~,i_flood] = unique(s_flood); -f_ebb = py.scipy.interpolate.interp1d(s_ebb.Discharge,... +f_ebb = py.scipy.interpolate.interpolate.interp1d(s_ebb.Discharge,... F_ebb, kwa); -f_flood = py.scipy.interpolate.interp1d(s_flood.Discharge,... +f_flood = py.scipy.interpolate.interpolate.interp1d(s_flood.Discharge,... 
F_flood, kwa); -F_ebb = double(f_ebb(s_new)); -F_flood = double(f_flood(s_new)); - -py.importlib.import_module('numpy'); -F_max_total = py.numpy.nanmax(F_ebb) + py.numpy.nanmax(F_flood); +F_max_total = max(F_ebb(~isnan(F_ebb))) + max(F_flood(~isnan(F_flood))); s_plot = repmat(s_new,2,1); figure = area(s_plot.', [F_ebb/F_max_total*100; F_flood/F_max_total*100].'); @@ -125,4 +118,4 @@ exportgraphics(gca, options.savepath); end -hold off +hold off \ No newline at end of file diff --git a/mhkit/tidal/io/read_noaa_json.m b/mhkit/tidal/io/read_noaa_json.m index 219649f3..ee2af828 100644 --- a/mhkit/tidal/io/read_noaa_json.m +++ b/mhkit/tidal/io/read_noaa_json.m @@ -19,42 +19,46 @@ % %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% -py.importlib.import_module('mhkit'); -py.importlib.import_module('numpy'); +fid = fopen(filename); % Open the file +raw = fread(fid,inf); % Read the contents +str = char(raw'); % Transformation +fclose(fid); % Close the file +init_data = jsondecode(str); -datap = py.mhkit.tidal.io.noaa.read_noaa_json(filename); - -datac=cell(datap); -data=struct(datac{2}); -data_df=datac{1}; - -fields=fieldnames(data); +data = struct(); +% Move metadata from its own field to the main structure +fields = fieldnames(init_data.metadata); for idx = 1:length(fields) - data.(fields{idx}) = string(data.(fields{idx})); + data.(fields{idx}) = convertCharsToStrings(... + init_data.metadata.(fields{idx})); end +init_data = rmfield(init_data,'metadata'); -xx=cell(data_df.axes); -v=xx{2}; - - -vv=cell(py.list(py.numpy.nditer(v.values,pyargs("flags",{"refs_ok"})))); +% Move the data fields over and create time +fields = fieldnames(init_data); +shape = size(fieldnames(init_data.(fields{1}))); +for idx = 1:length(fields) + data.(fields{idx}) = zeros(shape); +end -vals=double(py.array.array('d',py.numpy.nditer(data_df.values,pyargs("flags",{"refs_ok"})))); -sha=cell(data_df.values.shape); -x=int64(sha{1,1}); -y=int64(sha{1,2}); +% loop through the time (which gets read in as the field names for each +% data column) and fill in data -vals=reshape(vals,[x,y]); -si=size(vals); +time = fieldnames(init_data.(fields{1})); +data.time = cellfun(@(x) x(2:end), time, 'UniformOutput', false); +data.time = cellfun(@str2double,data.time); +data.time = transpose(data.time/1000.); +% Time is in epochtime since 1970-01-01. Divide by 1000 to remove +% milliseconds. convert with +% datetime(time,'ConvertFrom','epochtime','Epoch','1970-01-01') -for i=1:si(2) - test=string(py.str(vv{i})); - newname=split(test,","); - data.(newname(1))=vals(:,i); +for idx = 1:length(fields) + test = init_data.(fields{idx}).(time{1}); + data.(fields{idx}) = cell2mat(struct2cell(init_data.(fields{idx}))); + if ischar(test) || isstring(test) + data.(fields{idx}) = arrayfun(@str2double,data.(fields{idx})); + end end - -times = double( ... - py.mhkit_python_utils.pandas_dataframe.datetime_index_to_ordinal(data_df)); -data.time = posixtime(datetime(times, 'ConvertFrom', 'datenum')); +end \ No newline at end of file diff --git a/mhkit/tidal/io/request_noaa_data.m b/mhkit/tidal/io/request_noaa_data.m index 4afe40d6..f5140b41 100644 --- a/mhkit/tidal/io/request_noaa_data.m +++ b/mhkit/tidal/io/request_noaa_data.m @@ -90,146 +90,3 @@ elseif is_first_query data = data_in_period; is_first_query = false; - - else - % Append to existing data structure - for i = 1:length(timeseries_fields) - data.(timeseries_fields{i}) = cat( ... - 1, ... % by rows - data.(timeseries_fields{i}), ... 
- data_in_period.(timeseries_fields{i})); - end - end - - - start_period_datetime = end_period_datetime + days(1); -end - -% Write NOAA data to a JSON file -if options.write_json ~= "" - fid = fopen(options.write_json, 'w'); - fprintf(fid, jsonencode(data)); -end - -end - -function [data, timeseries_fields] = request_noaa_data_restricted_duration( ... - station, parameter, start_date, end_date, ... - MAX_RETRIES, REQUIRED_FIELDS, MAX_DAYS_PER_QUERY) - % Verify dates are less than MAX_DAYS_PER_QUERY apart - start_datetime = datetime(start_date, ... - 'InputFormat', 'yyyyMMdd', ... - 'TimeZone', 'UTC'); - end_datetime = datetime(end_date, ... - 'InputFormat', 'yyyyMMdd', ... - 'TimeZone', 'UTC'); - msg = 'Date range is greater than %d days in request_noaa_data_restricted_duration()'; - assert(days(end_datetime - start_datetime) + 1 <= MAX_DAYS_PER_QUERY, ... - sprintf(msg, MAX_DAYS_PER_QUERY)); - - % Formulate query - start_date = convertCharsToStrings(start_date); - start_date = convertCharsToStrings(start_date); - start_date = convertCharsToStrings(start_date); - end_date = convertCharsToStrings(end_date); - data_url = "https://tidesandcurrents.noaa.gov/api/datagetter?"; - api_query = "begin_date=" + start_date ... - + "&end_date=" + end_date ... - + "&station=" + station ... - + "&product=" + parameter ... - + "&units=metric&" ... - + "time_zone=gmt&" ... - + "application=web_services&" ... - + "format=xml"; - - % Display query - disp("Data request URL: " + data_url + api_query) - - % Submit query and get data - for i = 0:MAX_RETRIES - try - response = webread(data_url + api_query); - break; - catch ME - if i == MAX_RETRIES - disp(['MATLAB:request_noaa_data: ', ME.identifier]); - rethrow(ME) - else - pause(1); % pause(seconds) and retry query - end - end - end - - % Organize a structure containing the metadata and timeseries data - is_metadata_found = false; - is_timeseries_keys_found = false; - data_lines = strsplit(response, '\n'); - for i = 1:length(data_lines) - if startsWith(data_lines{i}, '= floodEbbNormalDegree1... + & directions(:,1) <= floodEbbNormalDegree2); +d2 = directions(directions(:,1) < floodEbbNormalDegree1... + | directions(:,1) > floodEbbNormalDegree2); +% Shift second set of of directions to not break between 360 and 0 +d2 = d2 - 180.; +% Renormalize the points (gets rid of negatives) +d2 = arrayfun(@normalize_angle,d2); +% Number of bins for semi-circl +n_dir = int32(floor(180/width_dir)); +% Compute 1D histograms on both semi circles +dir1_edges = [min(d1):(max(d1)-min(d1))/(double(n_dir)):max(d1)]; +Hd1 = histcounts(d1, dir1_edges); +Hd1 = pdf(Hd1, dir1_edges); +dir2_edges = [min(d2):(max(d2)-min(d2))/(double(n_dir)):max(d2)]; +Hd2 = histcounts(d2, dir2_edges); +Hd2 = pdf(Hd2, dir2_edges); +%Convert to perecnt +Hd1 = Hd1 * 100; % [%] +Hd2 = Hd2 * 100; % [%] +% Principal Directions average of the 2 bins +max1 = find(Hd1 == max(Hd1)); +max2 = find(Hd2 == max(Hd2)); +ebb = 0.5 * (dir1_edges(max1) +... + dir1_edges(max1 + 1)); % PrincipalDirection1 in python +flood = 0.5 * (dir2_edges(max2) +... 
+ dir2_edges(max2 + 1)) + 180.; % PrincipalDirection2 in python + + + function new_degree = normalize_angle(degree) + % Normalizes degrees to be between 0 and 360 + % + % Parameters + % ---------- + % degree: int or float + % + % Returns + % ------- + % new_degree: float + % Normalized between 0 and 360 degrees + + % Set new degree as remainder + new_degree = rem(degree, 360); + % Ensure Positive + new_degree = rem((new_degree + 360),360); + end + + function out = pdf(hist, edges) + % Probability density function: + % Given a histrogram, the result is the value of the + % probability density function at the bin, normalized such that + % the integral over the range is 1. Note that the sum of the + % histogram values will not be equal to 1 unless bins of unity + % width are chosen; it is not a probability *mass* function. + % + % Parameters + % ---------- + % hist: array (histogram count) + % edges: array (edges of histogram) + % + % Returns + % ------- + % out: array + % probability density of histogram + + db = diff(edges); + out = hist./db./sum(hist); + end + +end diff --git a/mhkit/wave/resource/cdip_netcdf.nc b/mhkit/wave/resource/cdip_netcdf.nc new file mode 100644 index 00000000..e69de29b diff --git a/mhkit_0.3.prj b/mhkit_0.3.prj new file mode 100644 index 00000000..c5c0f9f0 --- /dev/null +++ b/mhkit_0.3.prj @@ -0,0 +1,155 @@ + + + mhkit + + + NREL, PNNL, Sandia + Toolbox for processing data relevent to Marine Hydrokinetic energy converters. + + + 0.3.2 + ${PROJECT_ROOT}/mhkit.mltbx + + + + + a2dc6eda-931e-408a-8e3c-8f63d1f9da82 + % List files contained in your toolbox folder that you would like to exclude +% from packaging. Excludes should be listed relative to the toolbox folder. +% Some examples of how to specify excludes are provided below: +% +% A single file in the toolbox folder: +% .svn +% +% A single file in a subfolder of the toolbox folder: +% example/.svn +% +% All files in a subfolder of the toolbox folder: +% example/* +% +% All files of a certain name in all subfolders of the toolbox folder: +% **/.svn +% +% All files matching a pattern in all subfolders of the toolbox folder: +% **/*.bak +% +.DS_Store +python-utils/* +.git +.gitignore +setup.py +tests/* +General/* +General +tests +python-utils +*.json +*.JSON + true + + + + + + + + /General + /python-utils + /tests + /tidal + /tests/__pycache__ + /tests/data + /wave/resource/__pycache__ + + + false + + + + R2019b + latest + false + true + true + true + true + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + ${PROJECT_ROOT}/mhkit + + + ${PROJECT_ROOT}/mhkit/loads + ${PROJECT_ROOT}/mhkit/power + ${PROJECT_ROOT}/mhkit/qc + ${PROJECT_ROOT}/mhkit/river + ${PROJECT_ROOT}/mhkit/tidal + ${PROJECT_ROOT}/mhkit/utils + ${PROJECT_ROOT}/mhkit/wave + + + + + + /Users/rpauly/Documents/MHKit/MHKiT/mhkit-matlab-1/mhkit.mltbx + + + + /Applications/MATLAB_R2020a.app + + + + true + true + false + false + false + false + false + false + 10.15.7 + false + true + maci64 + true + + + \ No newline at end of file