[POWEROPS-2558] chore: linters and codecov in workflows #406

Merged 7 commits on Aug 8, 2024

22 changes: 17 additions & 5 deletions .github/workflows/build.yml
@@ -4,15 +4,19 @@ on:
   pull_request:
     branches: [main]

+env:
+  PYTHON_VERSION: "3.9"
+
 jobs:
   lint:
     runs-on: ubuntu-latest
+    environment: CI
     steps:
       - uses: actions/checkout@v4
       - name: Set up Python
         uses: actions/setup-python@v5
         with:
-          python-version: "3.9"
+          python-version: ${{ env.PYTHON_VERSION }}

       # The installation is required as mypy must be run in the local system environment, not in the pre-commit environment.
       - name: Install required dependencies

@@ -27,6 +31,7 @@ jobs:

   tests:
     runs-on: ubuntu-latest
+    environment: CI
     strategy:
       fail-fast: false
       matrix:

@@ -50,26 +55,33 @@

       - name: Test
         env:
-          CI: 1
-        run: pytest --skipcdf
+          SETTINGS__COGNITE__CLIENT_SECRET: ${{ secrets.CLIENT_SECRET }}
+          SETTINGS__COGNITE__CLIENT_ID: ${{ vars.CLIENT_ID }}
+          SETTINGS__COGNITE__PROJECT: ${{ vars.PROJECT }}
+          SETTINGS__COGNITE__CDF_CLUSTER: ${{ vars.CLUSTER }}
+          SETTINGS__COGNITE__TENANT_ID: ${{ vars.TENANT_ID }}

Comment on lines +59 to +62

Contributor: question: Is this pointing to environment variables in the CI environment? I don't see them there.

Contributor Author: They're on the repo level because all the envs need them.

+          SETTINGS__COGNITE__LOGIN_FLOW: "client_credentials"
+        run: pytest --cov --cov-report term --cov-report xml:coverage.xml

+      - uses: codecov/codecov-action@v4
+        if: matrix.python-version == ${{ env.PYTHON_VERSION }}
+        with:
+          file: ./coverage.xml
+          token: ${{ secrets.CODECOV_TOKEN }}

       - name: Build package
         run: poetry build

   build-docs:
     runs-on: ubuntu-latest
+    environment: CI

     steps:
       - uses: actions/checkout@v4

       - name: Set up Python
         uses: actions/setup-python@v5
         with:
-          python-version: 3.9
+          python-version: ${{ env.PYTHON_VERSION }}

       - name: Install Dependencies
         run: |

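The `SETTINGS__COGNITE__*` names above follow a nested-settings convention: a prefix, then one path segment per level of a settings object, joined by double underscores. A minimal sketch of how such variables could be consumed, assuming the project reads them with something like pydantic-settings (the `Settings` and `CogniteSettings` classes below are illustrative, not the repository's actual models):

```python
from pydantic import BaseModel
from pydantic_settings import BaseSettings, SettingsConfigDict


class CogniteSettings(BaseModel):
    # Field names mirror the env vars set in the workflow above.
    client_id: str
    client_secret: str
    project: str
    cdf_cluster: str
    tenant_id: str
    login_flow: str = "client_credentials"


class Settings(BaseSettings):
    # With this config, SETTINGS__COGNITE__CLIENT_ID populates settings.cognite.client_id.
    model_config = SettingsConfigDict(env_prefix="SETTINGS__", env_nested_delimiter="__")

    cognite: CogniteSettings


settings = Settings()  # fails fast with a validation error if a required variable is missing
```

Under this convention the repo-level variables and secrets referenced with `vars.*` and `secrets.*` are sufficient, which matches the author's note in the review thread above.
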
31 changes: 8 additions & 23 deletions .github/workflows/release.yml
@@ -8,26 +8,6 @@ env:
   PYTHON_VERSION: "3.9"

 jobs:
-  lint:
-    runs-on: ubuntu-latest
-    steps:
-      - uses: actions/checkout@v4
-      - name: Set up Python
-        uses: actions/setup-python@v5
-        with:
-          python-version: "3.9"
-
-      # The installation is required as mypy must be run in the local system environment, not in the pre-commit environment.
-      - name: Install required dependencies
-        run: |
-          python3 -m pip install --upgrade pip poetry
-          poetry config virtualenvs.create false
-          poetry install
-
-      - name: Linting and static code checks
-        run: |
-          pre-commit run --all-files
-

Comment on lines -11 to -30

Contributor: question: Not needed here because we do it in build.yml?

Contributor Author: Yea, no need to re-run pre-commit on merge to main given the branch protections and strict requirements for passing on PRs toward main.

   test_and_release:
     runs-on: ubuntu-latest
     environment: CD

@@ -45,12 +25,17 @@ jobs:

       - name: Test
         env:
-          CI: 1
-        run: pytest --skipcdf
+          SETTINGS__COGNITE__CLIENT_SECRET: ${{ secrets.CLIENT_SECRET }}
+          SETTINGS__COGNITE__CLIENT_ID: ${{ vars.CLIENT_ID }}
+          SETTINGS__COGNITE__PROJECT: ${{ vars.PROJECT }}
+          SETTINGS__COGNITE__CDF_CLUSTER: ${{ vars.CLUSTER }}
+          SETTINGS__COGNITE__TENANT_ID: ${{ vars.TENANT_ID }}
+          SETTINGS__COGNITE__LOGIN_FLOW: "client_credentials"
+        run: pytest --cov --cov-report term --cov-report xml:coverage.xml

+      - uses: codecov/codecov-action@v4
+        with:
+          file: ./coverage.xml
+          token: ${{ secrets.CODECOV_TOKEN }}

       - name: Build package
         run: poetry build

5 changes: 3 additions & 2 deletions .github/workflows/resync-apply.yaml
@@ -12,14 +12,15 @@ on:
 #       - "tests/data/demo/v1/**.yml"

 env:
+  PYTHON_VERSION: "3.11"
   CONFIGURATION_FILE: "tests/data/demo/v1/resync_configuration.yaml"

 jobs:
   resync-apply:
     name: Resync Apply "v1" to CDF project ${{ vars.PROJECT }}

     environment:
-      name: dev
+      name: CD

     runs-on: ubuntu-latest

@@ -28,7 +29,7 @@

       - uses: actions/setup-python@v5
         with:
-          python-version: 3.11
+          python-version: ${{ env.PYTHON_VERSION }}

       - name: Install dependencies
         run: |

5 changes: 3 additions & 2 deletions .github/workflows/resync-plan.yaml
@@ -11,14 +11,15 @@ on:
       - "tests/data/demo/v1/**.yml"

 env:
+  PYTHON_VERSION: "3.11"
   CONFIGURATION_FILE: "tests/data/demo/v1/resync_configuration.yaml"

 jobs:
   resync-plan:
     name: Resync Plan "v1" to CDF project ${{ vars.PROJECT }}

     environment:
-      name: dev
+      name: CI

     runs-on: ubuntu-latest

@@ -27,7 +28,7 @@

       - uses: actions/setup-python@v5
         with:
-          python-version: 3.11
+          python-version: ${{ env.PYTHON_VERSION }}

       - name: Install dependencies
         run: |

15 changes: 4 additions & 11 deletions .pre-commit-config.yaml
@@ -9,18 +9,16 @@ repos:
     hooks:
       - id: codespell

-  - repo: https://github.com/psf/black
-    rev: 24.4.2
-    hooks:
-      - id: black
-
   - repo: https://github.com/charliermarsh/ruff-pre-commit
     rev: "v0.5.6"
     hooks:
       - id: ruff
         args:
           - --exit-non-zero-on-fix
           - --config=pyproject.toml
+      - id: ruff-format
+        args:
+          - --config=pyproject.toml

   - repo: https://github.com/pre-commit/pre-commit-hooks
     rev: v4.6.0
@@ -36,19 +34,14 @@ repos:
         args:
           - --pytest-test-first
         exclude: "constants.py|mock_.*.py"
-      # - id: no-commit-to-branch # Disabled because these hooks are also run on PR merge to main
       - id: trailing-whitespace
+      - id: no-commit-to-branch

   - repo: https://github.com/google/yamlfmt
     rev: v0.13.0
     hooks:
       - id: yamlfmt

-  - repo: https://github.com/frnhr/pre-commit-nocommit
-    rev: 0.0.1
-    hooks:
-      - id: check-nocommit
-
   # Mypy must be run in the local system environment, not in the pre-commit environment.
   - repo: local
     hooks:

1 change: 1 addition & 0 deletions cognite/powerops/client/_logger.py
@@ -6,6 +6,7 @@
 2. Use vanilla Python logging:
 ```
 import logging
+
 logger = logging.getLogger(__name__)
 ```

1 change: 0 additions & 1 deletion cognite/powerops/client/shop/api/shop_run_api.py
@@ -145,7 +145,6 @@ def _shop_url(self) -> str:
         return f"https://power-ops-api{stage}.{cluster}.cognite.ai/{project}/run-shop"

     def _shop_url_shaas(self) -> str:
-
         project = self._client.config.project

         cluster = urlparse(self._client.config.base_url).netloc.split(".", 1)[0]

1 change: 0 additions & 1 deletion cognite/powerops/client/shop/shop_run_api.py
@@ -175,7 +175,6 @@ def _shop_url(self) -> str:
         return f"https://power-ops-api{stage}.{cluster}.cognite.ai/{project}/run-shop"

     def _shop_url_shaas(self) -> str:
-
         project = self._cdf.config.project

         cluster = urlparse(self._cdf.config.base_url).netloc.split(".", 1)[0]

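Both `_shop_url_shaas` hunks derive the cluster name by parsing the client's base URL: `urlparse` exposes the host as `.netloc`, and splitting once on the first dot keeps everything before it. A quick standalone illustration (the base URL below is made up):

```python
from urllib.parse import urlparse

base_url = "https://my-cluster.cognitedata.com"  # hypothetical base URL
netloc = urlparse(base_url).netloc  # "my-cluster.cognitedata.com"
cluster = netloc.split(".", 1)[0]  # "my-cluster"
```
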
@@ -513,6 +513,7 @@ def pre_apply(self, client: CogniteClient, shop_model: dict, start: int, end: in
         Example:
             ```python
             from cognite.client import CogniteClient
+
             start_time = datetime.timestamp(datetime(2000, 1, 1, 12))
             end_time = datetime.timestamp(datetime(2000, 1, 10, 12))
             model = {"reservoir": {"Lundevatn": {"vol_head": {"x": [10, 20, 40, 80, 160], "y": [2, 4, 6, 8, 10]}}}}

@@ -842,7 +843,6 @@ def pre_apply(self, client: CogniteClient, shop_model: dict, start: int, end: in
 def add_water_in_transit(
     inflow: pd.Series, discharge: pd.Series, shape: dict[int, float], start: datetime, end: datetime
 ) -> pd.Series:
-
     # Forward fill discharge for all (hour) timestamps until start
     one_hour = pd.Timedelta("1h")
     if start - one_hour not in discharge.index:

@@ -906,7 +906,6 @@ def apply(self, time_series_data: tuple[pd.Series]) -> pd.Series:
         """
         single_ts = time_series_data[0]
         if single_ts.empty or self.discharge.empty:
-
             return single_ts

         if self.pre_apply_has_run:

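The `# Forward fill discharge for all (hour) timestamps until start` comment in the second hunk describes padding the discharge series onto an hourly grid ending just before `start`. A self-contained sketch of that pattern in plain pandas (the helper name and sample data are invented for illustration):

```python
import pandas as pd


def ffill_discharge_until(discharge: pd.Series, start: pd.Timestamp) -> pd.Series:
    # Extend `discharge` onto an hourly grid up to the hour before `start`,
    # carrying the last observed value forward into any gaps.
    one_hour = pd.Timedelta("1h")
    hourly = pd.date_range(discharge.index[0], start - one_hour, freq="1h")
    return discharge.reindex(discharge.index.union(hourly)).ffill().loc[hourly]


discharge = pd.Series(
    [10.0, 12.0],
    index=pd.to_datetime(["2024-01-01 00:00", "2024-01-01 03:00"]),
)
padded = ffill_discharge_until(discharge, pd.Timestamp("2024-01-01 06:00"))
# 00:00-02:00 -> 10.0, 03:00-05:00 -> 12.0
```
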
4 changes: 3 additions & 1 deletion cognite/powerops/resync/core/cdf.py
@@ -136,7 +136,9 @@ def create(self, items: T_Instance | Sequence[T_Instance]) -> Any:
         else:
             try:
                 self.client.data_modeling.instances.apply(
-                    edges=chunk, auto_create_start_nodes=True, auto_create_end_nodes=True  # type: ignore[arg-type]
+                    edges=chunk,  # type: ignore[arg-type]
+                    auto_create_start_nodes=True,
+                    auto_create_end_nodes=True,
                 )
             except CogniteAPIError:
                 print(f"Failed on {chunk[0].dump()}")

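For context, `chunk` in this hunk comes from batching edges before each `instances.apply` call; the per-chunk `try/except` narrows a failure down to a single batch. A minimal sketch of the batching side of that pattern (this helper is hypothetical, not the module's actual code):

```python
from typing import Iterator, Sequence, TypeVar

T = TypeVar("T")


def chunked(items: Sequence[T], size: int) -> Iterator[Sequence[T]]:
    # Yield successive slices of at most `size` items, preserving order.
    for start in range(0, len(items), size):
        yield items[start : start + size]
```

Applying each slice in its own request keeps payloads bounded, and the `except CogniteAPIError` above can then report which batch failed.
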
3 changes: 1 addition & 2 deletions cognite/powerops/resync/v1/config_to_fdm.py
@@ -128,7 +128,6 @@ def to_data_model(self) -> tuple[list[v1_data_classes.DomainModelWrite], list[st
         external_ids: list[str] = []

         for folder in self.folders_to_process:
-
             all_data_model_files = list((folder).glob("*.yaml"))

             data_model_configuration_file = folder / "data_model_configuration.yaml"

@@ -229,7 +228,7 @@ def _dict_to_type_object(
     @staticmethod
     def _split_data_by_parsing_method(
-        raw_data: dict[str, Any]
+        raw_data: dict[str, Any],
     ) -> tuple[dict[str, Any], dict[str, Any], dict[str, Any]]:
         """Splits an unparsed dictionary into a dictionary for each type of parsing should be applied.

1 change: 0 additions & 1 deletion cognite/powerops/resync/v1/utils.py
@@ -77,7 +77,6 @@ def get_data_model_write_classes(data_model_client: Any) -> dict[str, type]:
     expected_types_mapping = {}
     for read_class in data_model_read_classes.keys():
-
         write_class_name = read_class.__name__ + "Write"

         read_name = to_snake(read_class.__name__)

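The hunk above pairs each read class with a write class by name (`read_class.__name__ + "Write"`) and uses `to_snake` to build lookup keys. A plausible standard-library-only implementation of such a CamelCase-to-snake_case helper (the repository's actual `to_snake` may differ):

```python
import re


def to_snake(name: str) -> str:
    # Insert an underscore before each interior uppercase letter, then lowercase:
    # "PriceAreaWrite" -> "price_area_write"
    return re.sub(r"(?<!^)(?=[A-Z])", "_", name).lower()
```
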
5 changes: 4 additions & 1 deletion cognite/powerops/utils/cdf/calls.py
@@ -156,7 +156,10 @@ def retrieve_latest(client: CogniteClient, external_ids: list[Optional[str]], be


 def retrieve_time_series_datapoints(
-    client: CogniteClient, mappings, start, end  # : List[TimeSeriesMapping]
+    client: CogniteClient,
+    mappings,
+    start,
+    end,  # : List[TimeSeriesMapping]
 ) -> dict[str, pd.Series]:
     time_series_start = retrieve_latest(
         client=client,