Fix analyze of default expression in struct field to column conversion #31

You are viewing an older attempt in the history of this workflow run.
Status: Failure
Total duration: 1h 55m 0s

build_main.yml (on: push)
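The run was triggered by a push event via build_main.yml. That file's contents are not shown here; as a rough sketch under that assumption, a push-triggered GitHub Actions workflow is declared along the following lines (the job name and steps are illustrative placeholders, not the actual build_main.yml):

```yaml
# Minimal illustrative sketch of a push-triggered workflow.
# Job and step contents are placeholders, not the real build_main.yml.
name: Build (build_main.yml)
on: push                      # trigger reported for this run

jobs:
  check-changes:              # hypothetical job standing in for "Run / Check changes"
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - run: ./dev/run-tests  # placeholder test command
```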
Run / Check changes (47s)
Run / Base image build (53s)
Run / Protobuf breaking change detection and Python CodeGen check (1m 4s)
Run / Run TPC-DS queries with SF=1 (48m 30s)
Run / Run Docker integration tests (44m 19s)
Run / Run Spark on Kubernetes Integration test (58m 10s)
Run / Run Spark UI tests (19s)
Matrix: Run / build
Matrix: Run / maven-build
Run / Build modules: sparkr (31m 8s)
Run / Linters, licenses, dependencies and documentation generation (28m 46s)
Matrix: Run / pyspark

Annotations

12 errors and 1 warning
Run / Build modules: yarn, connect
Process completed with exit code 18.
Run / Linters, licenses, dependencies and documentation generation
Process completed with exit code 1.
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-3cd4128f7ba6c1f6-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-e9c4028f7ba7a6d6-exec-1".
Run / Run Spark on Kubernetes Integration test
sleep interrupted
Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$621/0x00007fb49c5298c0@28add34e rejected from java.util.concurrent.ThreadPoolExecutor@50cb9b63[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 351]
Run / Run Spark on Kubernetes Integration test
sleep interrupted
Run / Run Spark on Kubernetes Integration test
Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$621/0x00007fb49c5298c0@15e4889 rejected from java.util.concurrent.ThreadPoolExecutor@50cb9b63[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 352]
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-f5b01a8f7bb9bcfe-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-6d315d8f7bba9cfc-exec-1".
Run / Run Spark on Kubernetes Integration test
HashSet() did not contain "decomtest-d680448f7bbe378a-exec-1".
Run / Run Spark on Kubernetes Integration test
Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-7d5eab026f7144f68d2f3ae1f21785cc-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-7d5eab026f7144f68d2f3ae1f21785cc-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..
Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming
No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.