Lint: Utils module #350

Status: Open. Wants to merge 37 commits into base: develop.

Commits (37)
3b5578b  matplotlib >=3.8 (ssolson, Aug 9, 2024)
6c62bfd  Merge branch 'develop' of https://github.com/MHKiT-Software/MHKiT-Pyt… (ssolson, Aug 19, 2024)
738e3d0  Merge branch 'develop' of https://github.com/MHKiT-Software/MHKiT-Pyt… (ssolson, Aug 19, 2024)
f91ef77  Merge branch 'develop' of https://github.com/MHKiT-Software/MHKiT-Pyt… (ssolson, Aug 28, 2024)
8b8d54b  lint utils (ssolson, Sep 4, 2024)
e1196e1  10 lint coverage (ssolson, Sep 4, 2024)
f4ec009  reduce handle_caching to 5 inputs (ssolson, Sep 4, 2024)
88810cf  index is now "t" (ssolson, Sep 5, 2024)
71974b3  10/10 lint (ssolson, Sep 5, 2024)
8d0263f  10/10 lint (ssolson, Sep 5, 2024)
769a26f  add test__calculate_statistics (ssolson, Sep 5, 2024)
2f2655b  10/10 lint (ssolson, Sep 5, 2024)
e6da2ed  10/10 pylint (ssolson, Sep 5, 2024)
1c3602a  handle cache returns None now (ssolson, Sep 6, 2024)
0e343d5  fix logic around None passed to handle_cache (ssolson, Sep 6, 2024)
486708d  back to index (ssolson, Sep 6, 2024)
bdf74b3  data no longer returned as list (ssolson, Sep 6, 2024)
dfad8dc  remove old cache_utils function (ssolson, Sep 6, 2024)
8d551e4  clean up (ssolson, Sep 6, 2024)
cf54e12  type hints (ssolson, Sep 9, 2024)
7720670  Merge branch 'develop' of https://github.com/MHKiT-Software/MHKiT-Pyt… (ssolson, Sep 11, 2024)
92ad905  Merge branch 'develop' of https://github.com/MHKiT-Software/MHKiT-Pyt… (ssolson, Sep 11, 2024)
65ad2e4  Merge branch 'develop' of https://github.com/MHKiT-Software/MHKiT-Pyt… (ssolson, Nov 12, 2024)
44416f5  fix pylint iussues (ssolson, Nov 12, 2024)
ef04cc2  clean up package installation (ssolson, Nov 13, 2024)
7064645  change env name to mhkit-env (ssolson, Nov 13, 2024)
7aaedea  clean up installation (ssolson, Nov 13, 2024)
9e0d63d  add cf-staging label (ssolson, Nov 13, 2024)
03c9552  Use conda env file in all tests (ssolson, Nov 13, 2024)
ddfd14f  add configs and debug (ssolson, Nov 15, 2024)
9f5e427  use legacy solver (ssolson, Nov 15, 2024)
e5f1b5c  Ensure compatibility with modern packages (ssolson, Nov 15, 2024)
fd9646b  Ensure compatibility with modern packages (ssolson, Nov 15, 2024)
b721f03  add pecos (ssolson, Nov 15, 2024)
e7fc584  netcdf4 from pip to conda (ssolson, Nov 15, 2024)
24fa8a5  py 3.11, relax hdf5 & netCDF4 (ssolson, Nov 15, 2024)
259e7e5  relax python constraints (ssolson, Nov 15, 2024)
141 changes: 97 additions & 44 deletions .github/workflows/main.yml
@@ -65,23 +65,39 @@ jobs:
auto-update-conda: true
python-version: ${{ env.PYTHON_VER }}
activate-environment: TESTconda
use-only-tar-bz2: true
use-only-tar-bz2: false

- name: Use legacy solver
run: conda config --set solver classic

- name: Configure conda channels
run: |
conda config --add channels conda-forge/label/cf-staging
conda config --add channels conda-forge
conda config --add channels defaults

- name: Setup Conda environment
- name: Create MHKiT Conda environment
shell: bash -l {0}
run: |
conda install numpy cython pip hdf5 libnetcdf cftime netcdf4 --strict-channel-priority
pip install -e . --force-reinstall
conda env create -f environment.yml --debug
conda activate mhkit-env

- name: Install dependencies
- name: Install testing dependencies
shell: bash -l {0}
run: |
python -m pip install --upgrade pip wheel
pip install coverage pytest coveralls .
conda activate mhkit-env
conda install -y pytest coverage coveralls

- name: Install mhkit
shell: bash -l {0}
run: |
conda activate mhkit-env
pip install -e . --no-deps

- name: Prepare non-hindcast API data
shell: bash -l {0}
run: |
conda activate mhkit-env
pytest mhkit/tests/river/test_io_usgs.py
pytest mhkit/tests/tidal/test_io.py
pytest mhkit/tests/wave/io/test_cdip.py
@@ -109,23 +125,30 @@ jobs:
auto-update-conda: true
activate-environment: TEST
python-version: ${{ env.PYTHON_VER }}
use-only-tar-bz2: true
use-only-tar-bz2: false

- name: Setup Conda environment
- name: Create MHKiT Conda environment
shell: bash -l {0}
run: |
conda install numpy cython pip pytest hdf5 libnetcdf cftime netcdf4 coverage --strict-channel-priority
pip install -e . --force-reinstall
conda env create -f environment.yml
conda activate mhkit-env

- name: Install dependencies
- name: Install testing dependencies
shell: bash -l {0}
run: |
python -m pip install --upgrade pip wheel
pip install coverage pytest coveralls .
conda activate mhkit-env
conda install -y pytest coverage coveralls

- name: Install mhkit
shell: bash -l {0}
run: |
conda activate mhkit-env
pip install -e . --no-deps

- name: Prepare Wave Hindcast data
shell: bash -l {0}
run: |
conda activate mhkit-env
pytest mhkit/tests/wave/io/hindcast/test_hindcast.py

- name: Upload Wave Hindcast data as artifact
@@ -151,23 +174,30 @@ jobs:
auto-update-conda: true
activate-environment: TEST
python-version: ${{ env.PYTHON_VER }}
use-only-tar-bz2: true
use-only-tar-bz2: false

- name: Setup Conda environment
- name: Create MHKiT Conda environment
shell: bash -l {0}
run: |
conda install numpy cython pip pytest hdf5 libnetcdf cftime netcdf4 coverage --strict-channel-priority
pip install -e . --no-deps --force-reinstall
conda env create -f environment.yml
conda activate mhkit-env

- name: Install dependencies
- name: Install testing dependencies
shell: bash -l {0}
run: |
python -m pip install --upgrade pip wheel
pip install coverage pytest coveralls .
conda activate mhkit-env
conda install -y pytest coverage coveralls

- name: Install mhkit
shell: bash -l {0}
run: |
conda activate mhkit-env
pip install -e . --no-deps

- name: Prepare Wind Hindcast data
shell: bash -l {0}
run: |
conda activate mhkit-env
pytest mhkit/tests/wave/io/hindcast/test_wind_toolkit.py

- name: Upload Wind Hindcast data as artifact
@@ -201,21 +231,28 @@ jobs:
python-version: ${{ matrix.python-version }}
use-only-tar-bz2: false

- name: Create and setup Conda environment
- name: Create MHKiT Conda environment
shell: bash -l {0}
run: |
conda install -c conda-forge pytest coverage=7.5.0 coveralls --strict-channel-priority
pip install -e . --force-reinstall
conda env create -f environment.yml
conda activate mhkit-env

- name: Download data from artifact
uses: actions/download-artifact@v4
with:
name: data
path: ~/.cache/mhkit
- name: Install testing dependencies
shell: bash -l {0}
run: |
conda activate mhkit-env
conda install -y pytest coverage coveralls

- name: Install mhkit
shell: bash -l {0}
run: |
conda activate mhkit-env
pip install -e . --no-deps

- name: Run pytest & generate coverage report
shell: bash -l {0}
run: |
conda activate mhkit-env
coverage run --rcfile=.github/workflows/.coveragerc --source=./mhkit/ -m pytest -c .github/workflows/pytest.ini
coverage lcov

@@ -310,11 +347,23 @@ jobs:
python-version: ${{ matrix.python-version }}
use-only-tar-bz2: false

- name: Setup Conda environment
- name: Create MHKiT Conda environment
shell: bash -l {0}
run: |
conda install -c conda-forge pytest coverage=7.5.0 coveralls --strict-channel-priority
pip install -e . --force-reinstall
conda env create -f environment.yml
conda activate mhkit-env

- name: Install testing dependencies
shell: bash -l {0}
run: |
conda activate mhkit-env
conda install -y pytest coverage coveralls

- name: Install mhkit
shell: bash -l {0}
run: |
conda activate mhkit-env
pip install -e . --no-deps

- name: Download Wave Hindcast data from artifact
uses: actions/download-artifact@v4
@@ -335,9 +384,10 @@
mv ~/.cache/mhkit/wind-hindcast/hindcast/* ~/.cache/mhkit/hindcast/
shell: bash

- name: Install MHKiT and run pytest
- name: Run hindcast pytest
shell: bash -l {0}
run: |
conda activate mhkit-env
coverage run --rcfile=.github/workflows/.coveragehindcastrc -m pytest -c .github/workflows/pytest-hindcast.ini
coverage lcov

@@ -416,23 +466,25 @@ jobs:
auto-update-conda: true
python-version: '3.11'
activate-environment: TESTconda
use-only-tar-bz2: true
use-only-tar-bz2: false

- name: Install dependencies
- name: Create MHKiT Conda environment
shell: bash -l {0}
run: |
conda install numpy cython pip hdf5 libnetcdf cftime netcdf4 --strict-channel-priority
pip install -e . --force-reinstall
python -m pip install --upgrade pip wheel
pip install nbval jupyter
pip install utm folium
conda env create -f environment.yml
conda activate mhkit-env

- name: Install notebook testing dependencies
shell: bash -l {0}
run: |
conda activate mhkit-env
conda install -y pytest coverage coveralls nbval jupyter utm folium

- name: Ensure Conda environment is activated
- name: Install mhkit
shell: bash -l {0}
run: |
echo "source ~/miniconda3/etc/profile.d/conda.sh" >> ~/.bashrc
echo "conda activate TESTconda" >> ~/.bashrc
source ~/.bashrc
conda activate mhkit-env
pip install -e . --no-deps

- name: Download non-hindcast data
uses: actions/download-artifact@v4
@@ -470,6 +522,7 @@ jobs:
- name: Run notebook
shell: bash -l {0}
run: |
conda activate mhkit-env
if [[ "${{ matrix.notebook }}" == "examples/metocean_example.ipynb" || "${{ matrix.notebook }}" == "examples/WPTO_hindcast_example.ipynb" ]]; then
if [[ "${{ needs.check-changes.outputs.should-run-hindcast }}" == 'true' ]]; then
jupyter nbconvert --to notebook --execute --inplace --ExecutePreprocessor.timeout=${{ matrix.timeout }} "${{ matrix.notebook }}"
4 changes: 4 additions & 0 deletions .github/workflows/pylint.yml
@@ -29,6 +29,10 @@ jobs:
run: |
pylint mhkit/power/

- name: Run Pylint on mhkit/utils/
run: |
pylint mhkit/utils/
Comment on lines +32 to +34 (Contributor Author): Add utils module to the linting enforcement.

- name: Run Pylint on mhkit/acoustics/
run: |
pylint mhkit/acoustics/
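
The new workflow step enforces a clean pylint run on mhkit/utils/. To reproduce the check locally without the workflow, the same target can be linted through pylint's Python entry point. The snippet below is illustrative only and not part of the PR; it assumes pylint 2.12+ is installed in the active environment and that it is run from the repository root (the CLI equivalent is simply `pylint mhkit/utils/`, as in the step above).

from pylint.lint import Run

# Lint mhkit/utils/ just as the new CI step does; exit=False returns control
# to the caller so the aggregate score can be inspected programmatically.
results = Run(["mhkit/utils/"], exit=False)
print(f"pylint score: {results.linter.stats.global_note:.2f}/10")  # the commits target 10/10
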
7 changes: 4 additions & 3 deletions environment.yml
@@ -1,4 +1,4 @@
name: myenv
name: mhkit-env
channels:
- conda-forge
- defaults
@@ -12,16 +12,17 @@ dependencies:
- scikit-learn>=1.5.1
- h5py>=3.11.0
- h5pyd>=0.18.0
- netCDF4>=1.6.5
- hdf5>=1.14.3,<1.14.5.0a0
- statsmodels>=0.14.2
- requests
- beautifulsoup4
- numexpr>=2.10.0
- lxml
- bottleneck
- pecos>=0.3.0
- pip:
- netCDF4>=1.7.1.post1
- matplotlib>=3.9.1
- pecos>=0.3.0
- fatpack
- NREL-rex>=0.2.63
- notebook
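
Because several dependencies move between the conda and pip sections and the environment is renamed to mhkit-env, a quick version check after `conda env create -f environment.yml` and `conda activate mhkit-env` can confirm the solve picked everything up. This is a local-verification sketch only, not something the PR adds; the package names are taken from the file above.

from importlib.metadata import PackageNotFoundError, version

# Report installed versions of the packages this PR touches in environment.yml.
for pkg in ("netCDF4", "matplotlib", "pecos", "h5py", "NREL-rex"):
    try:
        print(f"{pkg}: {version(pkg)}")
    except PackageNotFoundError:
        print(f"{pkg}: not installed")
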
12 changes: 10 additions & 2 deletions mhkit/river/io/usgs.py
@@ -120,7 +120,10 @@ def request_usgs_data(

# Use handle_caching to manage cache
cached_data, metadata, cache_filepath = handle_caching(
hash_params, cache_dir, write_json, clear_cache
hash_params,
cache_dir,
cache_content={"data": None, "metadata": None, "write_json": write_json},
clear_cache_file=clear_cache,
)

if cached_data is not None:
@@ -165,7 +168,12 @@ def request_usgs_data(

# After making the API request and processing the response, write the
# response to a cache file
handle_caching(hash_params, cache_dir, data=data, clear_cache_file=clear_cache)
handle_caching(
hash_params,
cache_dir,
cache_content={"data": data, "metadata": None, "write_json": None},
clear_cache_file=clear_cache,
)

if write_json:
shutil.copy(cache_filepath, write_json)
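
For reference, the updated call pattern used above: handle_caching now takes a cache_content dict with "data", "metadata", and "write_json" keys plus a clear_cache_file flag, and returns (data, metadata, cache_filepath). The sketch below restates that read/write round trip outside of usgs.py; it is based only on the call sites in this diff, and the import path, cache key, and directory are assumptions rather than part of the PR.

import pandas as pd
from mhkit.utils import handle_caching  # assumed export; may live in mhkit.utils.cache

hash_params = "usgs_15515500_discharge_demo"   # hypothetical cache key
cache_dir = "/tmp/mhkit_cache_demo"            # hypothetical cache directory

# Look up the cache: data=None in cache_content means "read if present".
cached_data, metadata, cache_filepath = handle_caching(
    hash_params,
    cache_dir,
    cache_content={"data": None, "metadata": None, "write_json": None},
    clear_cache_file=False,
)

# On a miss, write the freshly requested data back through cache_content["data"].
if cached_data is None:
    data = pd.DataFrame(
        {"discharge": [1.0, 2.0, 3.0]},
        index=pd.date_range("2009-08-01", periods=3, freq="D"),
    )
    handle_caching(
        hash_params,
        cache_dir,
        cache_content={"data": data, "metadata": None, "write_json": None},
        clear_cache_file=False,
    )
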
26 changes: 22 additions & 4 deletions mhkit/tests/utils/test_cache.py
@@ -93,7 +93,11 @@ def test_handle_caching_creates_cache(self):
Asserts:
- The cache file is successfully created at the expected file path.
"""
handle_caching(self.hash_params, self.cache_dir, data=self.data)
handle_caching(
self.hash_params,
self.cache_dir,
cache_content={"data": self.data, "metadata": None, "write_json": None},
)

cache_filename = (
hashlib.md5(self.hash_params.encode("utf-8")).hexdigest() + ".json"
@@ -114,8 +118,18 @@ def test_handle_caching_retrieves_data(self):
Asserts:
- The retrieved data matches the original sample DataFrame.
"""
handle_caching(self.hash_params, self.cache_dir, data=self.data)
retrieved_data, _, _ = handle_caching(self.hash_params, self.cache_dir)
handle_caching(
self.hash_params,
self.cache_dir,
cache_content={"data": self.data, "metadata": None, "write_json": None},
)

retrieved_data, _, _ = handle_caching(
self.hash_params,
self.cache_dir,
cache_content={"data": None, "metadata": None, "write_json": None},
)

pd.testing.assert_frame_equal(self.data, retrieved_data, check_freq=False)

def test_handle_caching_cdip_file_extension(self):
@@ -132,7 +146,11 @@ def test_handle_caching_cdip_file_extension(self):
- The cache file with a ".pkl" extension is successfully created at the expected file path.
"""
cache_dir = os.path.join(self.cache_dir, "cdip")
handle_caching(self.hash_params, cache_dir, data=self.data)
handle_caching(
self.hash_params,
cache_dir,
cache_content={"data": self.data, "metadata": None, "write_json": None},
)

cache_filename = (
hashlib.md5(self.hash_params.encode("utf-8")).hexdigest() + ".pkl"
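
The commits "handle cache returns None now" and "data no longer returned as list" imply the contract the updated tests rely on: a lookup for a key with no cache file behind it yields None rather than an empty list. A minimal sketch of that expectation, assuming the same signature as the tests above; the key and directory are placeholders, not values from the PR.

from mhkit.utils import handle_caching  # assumed export; may live in mhkit.utils.cache

# A key that was never cached should come back as None, not [].
missing, _, _ = handle_caching(
    "key-that-was-never-cached",   # placeholder key
    "/tmp/mhkit_cache_demo",       # placeholder cache directory
    cache_content={"data": None, "metadata": None, "write_json": None},
)
assert missing is None
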