[MINOR] Fix usages of orElse #30549
Triggered via pull request on January 10, 2024, 18:17
Status: Failure
Total duration: 2h 29m 28s
Artifacts: –
bot.yml
on: pull_request
validate-source
31s
Matrix: docker-java17-test
Matrix: integration-tests
Matrix: test-flink
Matrix: test-hudi-hadoop-mr-and-hudi-java-client
Matrix: test-spark-java17
Matrix: test-spark
Matrix: validate-bundles
Matrix: validate-release-candidate-bundles
Annotations
20 errors and 60 warnings
Errors:
test-spark (scala-2.12, spark3.0, hudi-spark-datasource/hudi-spark3.0.x):
- Failed to create marker type file /tmp/hoodie_test_path4122680067098188766/.hoodie/.temp/20240110194657536/MARKERS.type; java.lang.InterruptedException
- Process completed with exit code 1.
test-spark (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x):
- The job was canceled because "scala-2_12_spark3_0_hudi-" failed.
- Cannot resolve conflicts for overlapping writes
- The operation was canceled.
test-spark (scala-2.12, spark3.1, hudi-spark-datasource/hudi-spark3.1.x):
- The job was canceled because "scala-2_12_spark3_0_hudi-" failed.
- The operation was canceled.
- Cannot resolve conflicts for overlapping writes
test-spark (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x):
- The job was canceled because "scala-2_12_spark3_0_hudi-" failed.
- Cannot resolve conflicts for overlapping writes
- The operation was canceled.
test-spark (scala-2.11, spark2.4, hudi-spark-datasource/hudi-spark2):
- The job was canceled because "scala-2_12_spark3_0_hudi-" failed.
- Cannot resolve conflicts for overlapping writes
- The operation was canceled.
test-spark (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x):
- The job was canceled because "scala-2_12_spark3_0_hudi-" failed.
- Cannot resolve conflicts for overlapping writes
- The operation was canceled.
test-spark-java17 (scala-2.12, spark3.5, hudi-spark-datasource/hudi-spark3.5.x):
- Cannot resolve conflicts for overlapping writes
test-spark-java17 (scala-2.12, spark3.4, hudi-spark-datasource/hudi-spark3.4.x):
- Cannot resolve conflicts for overlapping writes
test-spark-java17 (scala-2.12, spark3.3, hudi-spark-datasource/hudi-spark3.3.x):
- Cannot resolve conflicts for overlapping writes
Warnings:
docker-java17-test (flink1.18, spark3.5, spark3.5.0), docker_test_java17.sh:
- Building Hudi with Java 8
- Done building Hudi with Java 8
- copying hadoop conf
- starting hadoop hdfs
- starting datanode:1
- starting datanode:2
- starting datanode:3
- starting hadoop hdfs, hdfs report
- Running tests with Java 17
- run_docker_tests Running Hudi maven tests on Docker
docker-java17-test (flink1.18, spark3.4, spark3.4.0), docker_test_java17.sh:
- Building Hudi with Java 8
- Done building Hudi with Java 8
- copying hadoop conf
- starting hadoop hdfs
- starting datanode:1
- starting datanode:2
- starting datanode:3
- starting hadoop hdfs, hdfs report
- Running tests with Java 17
- run_docker_tests Running Hudi maven tests on Docker
validate-bundles (flink1.16, spark3.3, spark3.3.1):
- validate.sh validating spark & hadoop-mr bundle
- validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
- validate.sh Writing sample data via Spark DataSource and run Hive Sync...
- validate.sh Query and validate the results using Spark SQL
- validate.sh Query and validate the results using HiveQL
- Use default java runtime under /opt/java/openjdk
- validate.sh spark & hadoop-mr bundles validation was successful.
- validate.sh done validating spark & hadoop-mr bundle
- validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
- validate.sh validating utilities slim bundle
validate-bundles (flink1.18, spark3.4, spark3.4.0):
- validate.sh validating spark & hadoop-mr bundle
- validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
- validate.sh Writing sample data via Spark DataSource and run Hive Sync...
- validate.sh Query and validate the results using Spark SQL
- validate.sh Query and validate the results using HiveQL
- Use default java runtime under /opt/java/openjdk
- validate.sh spark & hadoop-mr bundles validation was successful.
- validate.sh done validating spark & hadoop-mr bundle
- validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
- validate.sh validating utilities slim bundle
validate-bundles (flink1.18, spark3.5, spark3.5.0):
- validate.sh validating spark & hadoop-mr bundle
- validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
- validate.sh Writing sample data via Spark DataSource and run Hive Sync...
- validate.sh Query and validate the results using Spark SQL
- validate.sh Query and validate the results using HiveQL
- Use default java runtime under /opt/java/openjdk
- validate.sh spark & hadoop-mr bundles validation was successful.
- validate.sh done validating spark & hadoop-mr bundle
- validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
- validate.sh validating utilities slim bundle
validate-bundles (flink1.17, spark3.3, spark3.3.2):
- validate.sh validating spark & hadoop-mr bundle
- validate.sh setting up hive metastore for spark & hadoop-mr bundles validation
- validate.sh Writing sample data via Spark DataSource and run Hive Sync...
- validate.sh Query and validate the results using Spark SQL
- validate.sh Query and validate the results using HiveQL
- Use default java runtime under /opt/java/openjdk
- validate.sh spark & hadoop-mr bundles validation was successful.
- validate.sh done validating spark & hadoop-mr bundle
- validate.sh skip validating utilities bundle for non-spark2.4 & non-spark3.1 build
- validate.sh validating utilities slim bundle
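For context on the PR title: a common reason to "fix usages of orElse" in Java code is that `Optional.orElse(x)` always evaluates its argument, even when the Optional holds a value, whereas `Optional.orElseGet(supplier)` only invokes the supplier when the Optional is empty. The sketch below illustrates that standard-library behavior only; the class and method names are illustrative and not taken from the Hudi change itself.

```java
import java.util.Optional;
import java.util.concurrent.atomic.AtomicInteger;

public class OrElseDemo {

    // Returns how many times the fallback was computed for a *present* Optional.
    static int fallbackCalls(boolean useOrElseGet) {
        AtomicInteger calls = new AtomicInteger();
        Optional<String> present = Optional.of("value");
        if (useOrElseGet) {
            // orElseGet only invokes the supplier when the Optional is empty,
            // so the fallback is never computed here.
            present.orElseGet(() -> {
                calls.incrementAndGet();
                return "fallback";
            });
        } else {
            // orElse ALWAYS evaluates its argument eagerly, even though a
            // value is present and the fallback result is discarded.
            present.orElse(computeFallback(calls));
        }
        return calls.get();
    }

    static String computeFallback(AtomicInteger calls) {
        calls.incrementAndGet();
        return "fallback";
    }

    public static void main(String[] args) {
        System.out.println("orElse fallback computed " + fallbackCalls(false) + " time(s)");
        System.out.println("orElseGet fallback computed " + fallbackCalls(true) + " time(s)");
    }
}
```

When the fallback is expensive (or has side effects, as the `InterruptedException` paths in the failing tests suggest it can be in file-system code), switching from `orElse` to `orElseGet` avoids the wasted eager evaluation.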