chore: Refactor/replace black flake8 with ruff (#359)
* build(pyproject): replaced black and flake8 with ruff

* style: applied automatic Ruff linting and formatting

* style: Manual fixes of ruff linting results

* ci: Linting and formatting on pull request

* chore(pre-commit): Linting and Formatting runs on pre-commit and pre-push hooks

* fix: Tests failing due to missing imports

Ruff thought that the imports in context were unused; see the sketch after the commit message.

* fix: noqa on pep-8 naming for filters

Not sure if this solves the error; I have renamed the file locally, but for some reason the rename doesn't propagate to GitHub.

* chore: rename filters and add class def
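
As context for the two fix bullets above: Ruff's F401 rule removes imports it cannot see being used (for example, imports that only exist to be re-exported), and its pep8-naming rules (N8xx) flag non-conforming names. The sketch below illustrates the usual way to keep or suppress such findings; the file and names are hypothetical and are not the code changed in this commit.

# context.py -- hypothetical test-support module, not a file from this commit.
# The import exists only to be re-exported, so Ruff's F401 ("unused import")
# would otherwise flag it; a noqa marker or an explicit __all__ keeps it.
from fmu.sumo.explorer import Explorer  # noqa: F401

__all__ = ["Explorer"]


# A pep8-naming finding (e.g. N801, invalid class name) can be silenced the
# same way while a legacy name is kept for compatibility:
class legacyFilters:  # noqa: N801
    """Hypothetical class retained under its old, non-PEP8 name."""
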
sean-sinclair authored Dec 20, 2024
1 parent 4f45fdc commit 578d7d9
Showing 42 changed files with 689 additions and 416 deletions.
28 changes: 28 additions & 0 deletions .github/workflows/check_formatting.yml
@@ -0,0 +1,28 @@
name: Check formatting and linting

on:
  pull_request:
  push: { branches: [main] }

jobs:
  ruff-check:
    name: Run ruff lint and format checks
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-python@v5
        with:
          python-version: '3.11'
          cache: 'pip'

      - name: Installing dependencies
        run: pip install ruff

      - name: Run ruff lint
        run: ruff check .

      - name: Run ruff format
        run: ruff format . --check

16 changes: 16 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,16 @@
repos:
  - repo: local
    hooks:
      - id: lint
        name: Ruff Lint
        description: Linting using ruff
        entry: bash -c 'ruff check .'
        language: system
        stages: ["pre-commit", "pre-push"]

      - id: format
        name: Ruff Format
        description: Formatting using ruff
        entry: bash -c 'ruff format . --check'
        language: system
        stages: ["pre-commit", "pre-push"]
8 changes: 4 additions & 4 deletions docs/conf.py
@@ -39,9 +39,9 @@
     "pandas",
 ]

-os.environ[
-    "SPHINX_APIDOC_OPTIONS"
-] = "members,show-inheritance,inherited-members"
+os.environ["SPHINX_APIDOC_OPTIONS"] = (
+    "members,show-inheritance,inherited-members"
+)

 apidoc_module_dir = "../src/fmu"
 apidoc_output_dir = "apiref"
@@ -83,4 +83,4 @@
 # Output file base name for HTML help builder.
 htmlhelp_basename = "fmu-sumo"

-html_logo = "_static/equinor-logo2.jpg"
+html_logo = "_static/equinor-logo2.jpg"
8 changes: 4 additions & 4 deletions docs/explorer.rst
@@ -255,16 +255,16 @@ the ``has`` filter to find cases that have ``4d-seismic`` data:

 .. code-block:: python

-    from fmu.sumo.explorer import Explorer, Filters
+    from fmu.sumo.explorer import Explorer, filters

     exp = Explorer(env="prod")
-    cases = exp.cases.filter(asset="Heidrun", has=Filters.seismic4d)
+    cases = exp.cases.filter(asset="Heidrun", has=filters.seismic4d)

 In this case, we have a predefined filter for ``4d-seismic``, exposed
-through ``fmu.sumo.explorer.Filters``. There is no magic involved; any
+through ``fmu.sumo.explorer.filters``. There is no magic involved; any
 user can create their own filters, and either use them directly or ask
-for them to be added to ``fmu.sumo.explorer.Filters``.
+for them to be added to ``fmu.sumo.explorer.filters``.

 It is also possible to chain filters. The previous example could also
 be handled by
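
A side note on the filter paragraph above: the sketch below shows what passing a user-defined filter to the ``has`` argument could look like. The dictionary shape (an Elasticsearch-style query fragment matched against object metadata) is an assumption made for illustration, not something defined by this commit.

.. code-block:: python

    from fmu.sumo.explorer import Explorer

    # Hypothetical user-defined filter; the exact query structure is an
    # assumption here, not taken from the fmu.sumo.explorer.filters module.
    my_seismic_filter = {"term": {"data.content.keyword": "seismic"}}

    exp = Explorer(env="prod")
    cases = exp.cases.filter(asset="Heidrun", has=my_seismic_filter)
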
34 changes: 24 additions & 10 deletions examples/explorer.ipynb
@@ -43,11 +43,11 @@
 "outputs": [],
 "source": [
 "# Get Drogon cases\n",
-"myassetname = \"Drogon\" # Must be a valid asset on Sumo\n",
+"myassetname = \"Drogon\"  # Must be a valid asset on Sumo\n",
 "cases = sumo.cases.filter(asset=myassetname)\n",
 "\n",
 "# Get available status filters\n",
-"print(\"Statuses:\",cases.statuses)\n",
+"print(\"Statuses:\", cases.statuses)\n",
 "\n",
 "# Filter on status\n",
 "cases = cases.filter(status=\"keep\")\n",
@@ -67,11 +67,11 @@
 " print(\"\\n\")\n",
 "\n",
 "# Get case by name (name is not guaranteed to be unique)\n",
-"mycasename = cases[0].name # for sake of example\n",
+"mycasename = cases[0].name  # for sake of example\n",
 "case = sumo.cases.filter(name=mycasename)[0]\n",
 "\n",
 "# Get case by id\n",
-"mycaseuuid = cases[0].uuid # for sake of example\n",
+"mycaseuuid = cases[0].uuid  # for sake of example\n",
 "case = sumo.cases.filter(uuid=mycaseuuid)[0]\n",
 "\n",
 "# Select case\n",
@@ -210,7 +210,7 @@
 "\n",
 "layout = openvds.getLayout(openvds_handle)\n",
 "channel_count = layout.getChannelCount()\n",
-"print(\"Channel count: \", channel_count)\n",
+"print(\"Channel count: \", channel_count)\n",
 "print(\"Channel names: \")\n",
 "for i in range(channel_count):\n",
 " print(\" \", layout.getChannelName(i))"
@@ -232,7 +232,7 @@
 "source": [
 "# Perform aggregation on SurfaceCollection\n",
 "\n",
-"regsurf = surfs.min() # .max(), .mean(), .std(), .p10(), .p90(), .p50()\n",
+"regsurf = surfs.min()  # .max(), .mean(), .std(), .p10(), .p90(), .p50()\n",
 "regsurf.to_regular_surface().quickplot()"
 ]
 },
@@ -286,23 +286,37 @@
 "\n",
 "\n",
 "# get surfaces with timestamp in range\n",
-"time = TimeFilter(time_type=TimeType.TIMESTAMP, start=\"2018-01-01\", end=\"2022-01-01\")\n",
+"time = TimeFilter(\n",
+"    time_type=TimeType.TIMESTAMP, start=\"2018-01-01\", end=\"2022-01-01\"\n",
+")\n",
 "surfs = case.surfaces.filter(time=time)\n",
 "\n",
 "# get surfaces with time intervals in range\n",
-"time = TimeFilter(time_type=TimeType.INTERVAL, start=\"2018-01-01\", end=\"2022-01-01\")\n",
+"time = TimeFilter(\n",
+"    time_type=TimeType.INTERVAL, start=\"2018-01-01\", end=\"2022-01-01\"\n",
+")\n",
 "surfs = case.surfaces.filter(time=time)\n",
 "\n",
 "# get surfaces where intervals overlap with range\n",
-"time = TimeFilter(time_type=TimeType.INTERVAL, start=\"2018-01-01\", end=\"2022-01-01\", overlap=True)\n",
+"time = TimeFilter(\n",
+"    time_type=TimeType.INTERVAL,\n",
+"    start=\"2018-01-01\",\n",
+"    end=\"2022-01-01\",\n",
+"    overlap=True,\n",
+")\n",
 "surfs = case.surfaces.filter(time=time)\n",
 "\n",
 "# get surfaces with exact timestamp matching (t0 == start)\n",
 "time = TimeFilter(time_type=TimeType.TIMESTAMP, start=\"2018-01-01\", exact=True)\n",
 "surfs = case.surfaces.filter(time=time)\n",
 "\n",
 "# get surfaces with exact interval matching (t0 == start AND t1 == end)\n",
-"time = TimeFilter(time_type=TimeType.INTERVAL, start=\"2018-01-01\", end=\"2022-01-01\", exact=True)\n",
+"time = TimeFilter(\n",
+"    time_type=TimeType.INTERVAL,\n",
+"    start=\"2018-01-01\",\n",
+"    end=\"2022-01-01\",\n",
+"    exact=True,\n",
+")\n",
 "surfs = case.surfaces.filter(time=time)"
 ]
 },