Refactor code, introduce DataSourceMetricsMixin #2
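The PR itself is not shown here, only its CI run. As a rough illustration of the idea named in the title, a metrics mixin for data-source classes could look like the sketch below; every name in it (DataSourceMetricsMixin, record_read, metrics_snapshot, CsvSource) is a hypothetical placeholder and not taken from the PR's actual code.

```python
# Hypothetical sketch of a metrics mixin for data-source classes.
# None of these names come from the PR; they are illustrative only.
import time
from collections import Counter


class DataSourceMetricsMixin:
    """Adds simple read/error counters and timing to any data-source class."""

    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)  # cooperate with other base classes
        self._metrics = Counter()
        self._started_at = time.monotonic()

    def record_read(self, rows: int, bytes_read: int = 0) -> None:
        # Accumulate row and byte counters for each read operation.
        self._metrics["rows_read"] += rows
        self._metrics["bytes_read"] += bytes_read

    def record_error(self) -> None:
        self._metrics["errors"] += 1

    def metrics_snapshot(self) -> dict:
        # Return a plain dict so callers can log or export the metrics.
        snapshot = dict(self._metrics)
        snapshot["uptime_seconds"] = time.monotonic() - self._started_at
        return snapshot


class CsvSource(DataSourceMetricsMixin):
    """Toy data source that reports how much it has read via the mixin."""

    def read(self, lines):
        for line in lines:
            self.record_read(rows=1, bytes_read=len(line))
            yield line.split(",")


if __name__ == "__main__":
    source = CsvSource()
    list(source.read(["a,b,c", "1,2,3"]))
    print(source.metrics_snapshot())  # e.g. {'rows_read': 2, 'bytes_read': 10, ...}
```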
build_main.yml
on: push
Run / Check changes (32s)
Run / Protobuf breaking change detection and Python CodeGen check (1m 23s)
Run / Run TPC-DS queries with SF=1 (46m 47s)
Run / Run Docker integration tests (1h 8m)
Run / Run Spark on Kubernetes Integration test (55m 0s)
Run / Run Spark UI tests (20s)
Matrix: Run / build
Matrix: Run / java-other-versions
Run / Build modules: sparkr (25m 58s)
Run / Linters, licenses, dependencies and documentation generation (1h 1m)
Matrix: Run / pyspark
Annotations
10 errors and 2 warnings
Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-24e2f88d602b0085-exec-1".
Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-3495b78d602bdce7-exec-1".
Run / Run Spark on Kubernetes Integration test: sleep interrupted
Run / Run Spark on Kubernetes Integration test: sleep interrupted
Run / Run Spark on Kubernetes Integration test: Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$694/0x00007f4d0c5c8000@44e0e48 rejected from java.util.concurrent.ThreadPoolExecutor@49b577b9[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 374]
Run / Run Spark on Kubernetes Integration test: Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$694/0x00007f4d0c5c8000@60abad77 rejected from java.util.concurrent.ThreadPoolExecutor@49b577b9[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 375]
Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-1ae9f48d603d079a-exec-1".
Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-edef398d603dee06-exec-1".
Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-0d47218d60418176-exec-1".
Run / Run Spark on Kubernetes Integration test: Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-981ba72ea34f4895ba6c78f7d933472f-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-981ba72ea34f4895ba6c78f7d933472f-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={})..
Run / Protobuf breaking change detection and Python CodeGen check: Node.js 16 actions are deprecated. Please update the following actions to use Node.js 20: bufbuild/buf-setup-action@v1, bufbuild/buf-lint-action@v1, bufbuild/buf-breaking-action@v1. For more information see: https://github.blog/changelog/2023-09-22-github-actions-transitioning-from-node-16-to-node-20/.
Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming: No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
Artifacts
Produced during runtime
Name | Size
---|---
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, streaming-kinesis-asl, yarn, kubernetes, hadoop-cloud, spark-ganglia-lgpl, connect, protobuf--17-hadoop3-hive2.3 (Expired) | 161 KB