Migrate to poetry #41

Merged: 3 commits, Aug 18, 2022
82 changes: 21 additions & 61 deletions .github/workflows/build.yaml
```diff
@@ -10,85 +10,45 @@ jobs:
   validate_code:
     name: Validate code in spark container
     runs-on: ubuntu-latest
     if: ${{ github.ref != 'refs/heads/main' }}

     container:
-      image: esdcrproduction.azurecr.io/spark:v1.1.13-bitnami-3.1.2-python-3.9.6-0
+      image: esdcrproduction.azurecr.io/spark:v1.3.2-bitnami-3.2.0-python-3.9.7-0
       credentials:
         username: ${{ secrets.AZCR_PROD_USER }}
         password: ${{ secrets.AZCR_PROD_TOKEN }}
       options: -u root -w /opt/bitnami/spark --mount type=tmpfs,destination=/home/spark

     steps:
       - uses: actions/checkout@v2
-      - name: Prepare venv
+      - name: Install Poetry
         run: |
           set -e

-          python -m virtualenv hadoopwrapper
-          . hadoopwrapper/bin/activate
-          pip install -r ./requirements.txt
-          pip install -r ./requirements-dev.txt
-      - name: Lint
+          curl -sSL https://install.python-poetry.org | python3 - --preview
+      - name: Install Dependencies
         run: |
           set -e

-          pypath=$(pwd)
-          export PYTHONPATH="$pypath/src:$PYTHONPATH"
-
-          . hadoopwrapper/bin/activate
-          find ./src/hadoop_fs_wrapper -type f -name "*.py" | xargs pylint
-      - name: Unit test
+          /github/home/.local/bin/poetry install
+      - name: Lint
         run: |
           set -e

-          pypath=$(pwd)
-          export PYTHONPATH="$pypath/src:$PYTHONPATH"
-
-          . hadoopwrapper/bin/activate
-          pytest ./test
-
-  create_release:
-    name: Create Release
-    runs-on: ubuntu-latest
-    needs: [ validate_code ]
-    if: ${{ github.ref == 'refs/heads/main' }}
-
-    steps:
-      - uses: actions/checkout@v2
-      - run: git fetch --prune --unshallow
-      - name: Create Release
-        uses: SneaksAndData/github-actions/[email protected]
-        with:
-          major_v: 0
-          minor_v: 4
-
-  release_to_pypi_test:
-    name: Release distribution to test.pypi.org
-    runs-on: ubuntu-latest
-    needs: [ create_release ]
+          pypath=$(pwd)
+          export PYTHONPATH="$pypath:$PYTHONPATH"

-    steps:
-      - uses: actions/checkout@v2
-      - run: git fetch --prune --unshallow
-      - uses: actions/setup-python@v2
-        with:
-          python-version: '3.8.x'
-      - name: Build wheel
+          find ./hadoop_fs_wrapper -type f -name "*.py" | xargs /github/home/.local/bin/poetry run pylint
+      - name: Unit test
         run: |
           set -e

-          version=$(git describe --tags --abbrev=7)
-
-          pip install virtualenv
-          python -m virtualenv hadoopwrapper
-
-          . hadoopwrapper/bin/activate
-          pip install --upgrade twine build
-
-          echo "__version__ = '$version'" > ./src/hadoop_fs_wrapper/_version.py
-
-          python -m build --sdist --wheel
-      - name: Publish distribution 📦 to Test PyPI
-        uses: pypa/gh-action-pypi-publish@master
+          pypath=$(pwd)
+          export PYTHONPATH="$pypath:$PYTHONPATH"
+
+          /github/home/.local/bin/poetry run pytest ./test --doctest-modules --junitxml=junit/test-results.xml --cov=. --cov-report=term-missing:skip-covered | tee pytest-coverage.txt
+      - name: Publish Code Coverage
+        uses: MishaKav/pytest-coverage-comment@main
         with:
-          password: ${{ secrets.PYPI_TEST_API_TOKEN }}
-          repository_url: https://test.pypi.org/legacy/
+          pytest-coverage-path: ./pytest-coverage.txt
+          junitxml-path: ./junit/test-results.xml
```
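For reference, the new validation job can be approximated on a developer machine. This is a hedged local sketch, not part of the PR itself: it assumes Poetry ends up on the PATH (in the CI container it is invoked via `/github/home/.local/bin/poetry`) and that the commands are run from the repository root.

```bash
# Approximate local equivalent of the validate_code job above.
set -e

# Install Poetry with the same installer the workflow uses.
curl -sSL https://install.python-poetry.org | python3 - --preview

# Install runtime and dev dependencies declared in pyproject.toml.
poetry install

# Lint the package sources, mirroring the Lint step.
export PYTHONPATH="$(pwd):$PYTHONPATH"
find ./hadoop_fs_wrapper -type f -name "*.py" | xargs poetry run pylint

# Run the unit tests with coverage, mirroring the Unit test step.
poetry run pytest ./test --cov=. --cov-report=term-missing:skip-covered
```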
34 changes: 0 additions & 34 deletions .github/workflows/deploy.yaml

This file was deleted.

19 changes: 19 additions & 0 deletions .github/workflows/prepare_release.yaml
```yaml
name: Prepare GH Release

on: workflow_dispatch

jobs:
  create_release:
    name: Create Release
    runs-on: ubuntu-latest
    if: ${{ github.ref == 'refs/heads/main' }}

    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0
      - name: Create Release
        uses: SneaksAndData/github-actions/[email protected]
        with:
          major_v: 0
          minor_v: 4
```
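Both new workflows are triggered manually through `workflow_dispatch`. As a hedged example (the workflow names come from the files in this PR, but the CLI invocation itself does not), they can be started with the GitHub CLI:

```bash
# Manually trigger the release-preparation workflow on the default branch.
gh workflow run "Prepare GH Release"

# Later, trigger the PyPI release workflow (see release.yaml below).
gh workflow run "Release a new version"
```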
51 changes: 51 additions & 0 deletions .github/workflows/release.yaml
```yaml
name: Release a new version

on: workflow_dispatch
jobs:
  release_to_pypi:
    name: Release distribution to PyPi
    runs-on: ubuntu-latest

    steps:
      - uses: actions/checkout@v2
        with:
          fetch-depth: 0
      - uses: actions/setup-python@v2
        with:
          python-version: '3.8.x'
          architecture: 'x64'
      - name: Install Poetry and prepare version
        run: |
          set -e

          curl -sSL https://install.python-poetry.org | python3 - --preview

          version=$(git describe --tags --abbrev=7)
          sed -i "s/version = \"0.0.0\"/version = \"${version:1}\"/" pyproject.toml
          echo "__version__ = '${version:1}'" > ./hadoop_fs_wrapper/_version.py

      - name: Configure Test PyPi
        if: ${{ github.ref == 'refs/heads/main' }}
        env:
          PYPI_TEST_TOKEN: ${{ secrets.PYPI_TEST_API_TOKEN }}
        run: |
          set -e

          poetry config repositories.test-pypi https://test.pypi.org/legacy/
          poetry config pypi-token.test-pypi $PYPI_TEST_TOKEN

      - name: Publish distribution 📦 to test PyPI
        if: ${{ github.ref == 'refs/heads/main' }}
        run: |
          set -e

          poetry build && poetry publish -r test-pypi

      - name: Publish distribution 📦 to PyPI
        env:
          POETRY_PYPI_TOKEN_PYPI: ${{ secrets.PYPI_API_TOKEN }}
        if: ${{ startsWith(github.ref, 'refs/tags') }}
        run: |
          set -e

          poetry build && poetry publish
```
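Versioning is driven by git tags: `git describe --tags --abbrev=7` returns the tag-based version, `${version:1}` strips the leading `v`, and `sed` writes the result over the `version = "0.0.0"` placeholder in pyproject.toml. A hedged illustration (the tag value below is made up; real values come from `git describe` in CI):

```bash
# Hypothetical tag for illustration only.
version="v0.4.10"

# ${version:1} drops the leading "v" -> 0.4.10
echo "${version:1}"

# Inject the version into the package metadata, as the workflow does.
sed -i "s/version = \"0.0.0\"/version = \"${version:1}\"/" pyproject.toml
echo "__version__ = '${version:1}'" > ./hadoop_fs_wrapper/_version.py
```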
3 changes: 2 additions & 1 deletion README.md
````diff
@@ -16,8 +16,9 @@ Select a version that matches hadoop version you are using:
 
 ## Usage
 Common use case is accessing Hadoop FileSystem from Spark session object:
+
 ```python
-from hadoop_fs_wrapper.wrappers.file_system import FileSystem
+from hadoop_fs_wrapper.wrappers import FileSystem
 
 file_system = FileSystem.from_spark_session(spark=spark_session)
 ```
````
File renamed without changes.
File renamed without changes.
File renamed without changes.
22 changes: 22 additions & 0 deletions pyproject.toml
```toml
[tool.poetry]
name = "hadoop-fs-wrapper"
version = "0.0.0"
description = "Python Wrapper for Hadoop Java API"
authors = ["ECCO Sneaks & Data <[email protected]>"]
maintainers = ['GZU <[email protected]>', 'JRB <[email protected]>']
license = 'MIT'
readme = "README.md"
repository = 'https://github.com/SneaksAndData/hadoop-fs-wrapper'

[tool.poetry.dependencies]
python = "^3.8"
pyspark = "~3.2"

[tool.poetry.dev-dependencies]
pytest = "^7.0"
pytest-cov = "^2.12"
pylint = "^2.12"

[build-system]
requires = ["poetry-core>=1.0.0"]
build-backend = "poetry.core.masonry.api"
```
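The `[tool.poetry.dependencies]` and `[tool.poetry.dev-dependencies]` tables take over the role of the deleted requirements.txt and requirements-dev.txt files. A hedged sketch of the day-to-day commands (standard Poetry 1.x CLI; only `poetry install` and `poetry build` appear in this PR's workflows):

```bash
# Add or update dependencies; Poetry writes the constraint into pyproject.toml.
poetry add pyspark            # runtime dependency -> [tool.poetry.dependencies]
poetry add --dev pytest-cov   # dev dependency -> [tool.poetry.dev-dependencies]

# Install everything declared above (what the CI "Install Dependencies" step runs).
poetry install

# Build the sdist and wheel that the release workflow publishes.
poetry build
```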
3 changes: 0 additions & 3 deletions requirements-dev.txt

This file was deleted.

1 change: 0 additions & 1 deletion requirements.txt

This file was deleted.

38 changes: 0 additions & 38 deletions setup.py

This file was deleted.

2 changes: 1 addition & 1 deletion test/test_file_system.py
```diff
@@ -26,7 +26,7 @@
 import pytest
 from pyspark.sql import SparkSession
 
-from hadoop_fs_wrapper.wrappers.file_system import FileSystem
+from hadoop_fs_wrapper.wrappers import FileSystem
 
 
 @pytest.fixture
```
4 changes: 2 additions & 2 deletions test/test_parse_hadoop_filestatus.py
```diff
@@ -22,8 +22,8 @@
 
 
 from datetime import datetime
-from hadoop_fs_wrapper.models.hadoop_file_status import HadoopFileStatus
-from hadoop_fs_wrapper.models.file_status import FileStatus
+from hadoop_fs_wrapper.models import HadoopFileStatus
+from hadoop_fs_wrapper.models import FileStatus
 
 
 class MockPath:
```