feat: Updated .github/workflows/github-actions-bas #470

name: Build and test
on:
  push:
    branches:
      - 'main'
      - '**'
      - '!branch-*.*'
  pull_request:
    types: [opened, reopened, edited]
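# Triggers: pushes to any branch except those matching branch-*.*, plus pull requests
# when they are opened, reopened, or edited.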
jobs:
  # Build: build and run the tests for specified modules.
  build:
    runs-on: ubuntu-22.04
    strategy:
      fail-fast: false
    env:
      SPARK_VERSION: ${{ matrix.spark }}
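    # NOTE: ${{ matrix.spark }} presumes a strategy.matrix with a `spark` axis; none is
    # defined in this workflow, so SPARK_VERSION will likely resolve to an empty string.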
    steps:
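      # Reuse the Coursier dependency cache across runs so sbt dependencies are not re-downloaded.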
      - uses: coursier/cache-action@v6
      - name: sbt
        run: |
          sudo apt-get update
          sudo apt-get install -y apt-transport-https curl gnupg -yqq
          curl -sL "https://keyserver.ubuntu.com/pks/lookup?op=get&search=0x2EE0EA64E40A89B84B2DF73499E82A75642AC823" | sudo -H gpg --no-default-keyring --keyring gnupg-ring:/etc/apt/trusted.gpg.d/scalasbt-release.gpg --import
          echo "deb https://repo.scala-sbt.org/scalasbt/debian /" | sudo tee /etc/apt/sources.list.d/sbt.list
          sudo apt-get update
          sudo apt-get install -y sbt
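      # Full clone (fetch-depth: 0) keeps the tag history available, which the
      # "show rules/dynver" step further down likely needs to derive a version.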
      - name: Checkout
        uses: actions/checkout@v3
        with:
          fetch-depth: 0
      # Install python deps
      - name: Install python deps
        run: pip install -r requirements.txt
      - name: Make pip python programs findable
        # `export PATH=...` only lasts for a single step; appending to $GITHUB_PATH persists for later steps
        run: echo "$(python3 -m site --user-base)/bin" >> "$GITHUB_PATH"
      - name: SQL Module - Lint Checks
        run: flake8 --max-line-length 100 --ignore=E129,W504 sql/
      - name: SQL Module - Install and test
        run: cd sql; pip install -e .; pytest .
      # Run the scala tests.
      # We are extracting the dynamic version from something like [info] 0.1.9+18-ddfaf3e6-SNAPSHOT (non-snapshots will not have +...)
      - name: Run scalafix sbt Spark 2.3.2 tests
        run: cd scalafix; sbt ";clean;compile;test" -DsparkVersion=2.3.2 ;cd ..
      - name: Run scalafix sbt Spark 2.1.1 tests
        run: cd scalafix; sbt ";clean;compile;test" -DsparkVersion=2.1.1 ;cd ..
      - name: Run sbt tests on scalafix & extract the dynver
        run: cd scalafix; sbt ";clean;compile;test;publishLocal;+publishLocal"; sbt "show rules/dynver" |grep "\[info\]" |grep "[0-9].[0-9].[0-9]" | cut -f 2 -d " " > ~/rules_version
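      # Illustration of the extraction above, assuming the sample output from the comment:
      #   [info] 0.1.9+18-ddfaf3e6-SNAPSHOT
      # grep keeps the [info] version line and `cut -f 2 -d " "` takes its second field,
      # so ~/rules_version ends up containing 0.1.9+18-ddfaf3e6-SNAPSHOT (a plain x.y.z for releases).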
      - name: Run sbt tests on our WAP plugin
        run: cd iceberg-spark-upgrade-wap-plugin; sbt ";clean;test"
      - name: PySparkler - Make Install
        run: |
          cd pysparkler
          pip install .
      - name: PySparkler - Make Lint
        run: |
          cd pysparkler
          flake8 .
      - name: PySparkler - Make Test
        run: |
          cd pysparkler
          pytest .
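      # Cache the Spark/Hadoop archives and their extracted directories used by the e2e demo
      # (presumably downloaded by the demo scripts) so re-runs can skip the download/extract work.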
      - name: Cache tgzs
        id: cache-tgz
        uses: actions/cache@v3
        with:
          path: e2e_demo/scala/*.tgz
          key: ${{ runner.os }}-tgz
      - name: Cache tar.gz
        id: cache-targz
        uses: actions/cache@v3
        with:
          path: e2e_demo/scala/*.tar.gz
          key: ${{ runner.os }}-targz
      - name: Cache extractions spark 3
        id: cache-extract-spark3
        uses: actions/cache@v3
        with:
          path: e2e_demo/scala/spark-3.3.1-bin-hadoop2
          key: ${{ runner.os }}-extract-spark3
      - name: Cache extractions hadoop
        id: cache-extract-hadoop
        uses: actions/cache@v3
        with:
          path: e2e_demo/scala/hadoop-2*
          key: ${{ runner.os }}-extract-hadoop2
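      # Both demos read the version written to ~/rules_version above; presumably the scripts
      # use SCALAFIX_RULES_VERSION to resolve the scalafix rules that were published locally.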
      # Run the sbt e2e demo with the local version
      - name: sbt e2e demo
        run: cd e2e_demo/scala; SCALAFIX_RULES_VERSION=$(cat ~/rules_version) NO_PROMPT="yes" ./run_demo.sh
      # Run the gradle e2e demo with the local version
      - name: gradle e2e demo
        run: cd e2e_demo/scala; SCALAFIX_RULES_VERSION=$(cat ~/rules_version) NO_PROMPT="yes" ./run_demo-gradle.sh