Commit

update iceberg test
allisonport-db committed Feb 14, 2025
1 parent 267372a commit b89c5a2
Showing 1 changed file with 1 addition and 28 deletions.
29 changes: 1 addition & 28 deletions .github/workflows/iceberg_test.yaml
@@ -2,15 +2,14 @@ name: "Delta Iceberg Latest"
on: [push, pull_request]
jobs:
test:
name: "DSL: Scala ${{ matrix.scala }}, Shard ${{ matrix.shard }}"
name: "DIL: Scala ${{ matrix.scala }}"
runs-on: ubuntu-20.04
strategy:
matrix:
# These Scala versions must match those in the build.sbt
scala: [2.12.18, 2.13.13]
env:
SCALA_VERSION: ${{ matrix.scala }}
# Important: This must be the same as the length of shards in matrix
steps:
- uses: actions/checkout@v3
# TODO we can make this more selective
@@ -57,32 +56,6 @@ jobs:
pyenv install 3.8.18
pyenv global system 3.8.18
pipenv --python 3.8 install
# Update the pip version to 24.0. By default `pyenv.run` installs the latest pip version
# available. From version 24.1, `pip` doesn't allow installing python packages
# with version string containing `-`. In Delta-Spark case, the pypi package generated has
# `-SNAPSHOT` in version (e.g. `3.3.0-SNAPSHOT`) as the version is picked up from
the `version.sbt` file.
pipenv run pip install pip==24.0 setuptools==69.5.1 wheel==0.43.0
pipenv run pip install pyspark==3.5.3
pipenv run pip install flake8==3.5.0 pypandoc==1.3.3
pipenv run pip install black==23.9.1
pipenv run pip install importlib_metadata==3.10.0
pipenv run pip install mypy==0.982
pipenv run pip install mypy-protobuf==3.3.0
pipenv run pip install cryptography==37.0.4
pipenv run pip install twine==4.0.1
pipenv run pip install wheel==0.33.4
pipenv run pip install setuptools==41.1.0
pipenv run pip install pydocstyle==3.0.0
pipenv run pip install pandas==1.1.3
pipenv run pip install pyarrow==8.0.0
pipenv run pip install numpy==1.20.3
if: steps.git-diff.outputs.diff
- name: Scala structured logging style check
run: |
if [ -f ./dev/spark_structured_logging_style.py ]; then
python3 ./dev/spark_structured_logging_style.py
fi
if: steps.git-diff.outputs.diff
- name: Run Scala/Java and Python tests
# when changing TEST_PARALLELISM_COUNT make sure to also change it in spark_master_test.yaml
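Note on the removed pip pin: the deleted comment above says pip 24.1+ refuses to install packages whose version string contains `-`, which is why the step pinned pip to 24.0 for the Delta-Spark dev build versioned `3.3.0-SNAPSHOT` (picked up from `version.sbt`). A minimal Python sketch, not part of this commit or workflow, illustrating that such a version string is not PEP 440 compliant (the rule newer pip releases enforce via the `packaging` library):

# Illustration only (not part of this commit): "3.3.0-SNAPSHOT" is not a
# valid PEP 440 version, while "3.3.0" is.
from packaging.version import Version, InvalidVersion

for candidate in ("3.3.0", "3.3.0-SNAPSHOT"):
    try:
        print(candidate, "->", Version(candidate))   # "3.3.0" parses cleanly
    except InvalidVersion:
        print(candidate, "-> rejected: not PEP 440 compliant")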
