Triggered via push: December 28, 2023 10:36
Status: Failure
Total duration: 1h 10m 12s
Artifacts: 14
Build modules: sparkr (21m 58s)
Linters, licenses, dependencies and documentation generation (7m 53s)
Scala 2.13 build with SBT (10m 0s)
Hadoop 2 build with SBT (9m 8s)
Run TPC-DS queries with SF=1 (12m 14s)
Run docker integration tests (52s)
Matrix: build
Matrix: java-11-17
Matrix: pyspark

Annotations

20 errors and 22 warnings
Run docker integration tests
Process completed with exit code 1.
Linters, licenses, dependencies and documentation generation
Process completed with exit code 1.
Build modules: pyspark-sql, pyspark-mllib, pyspark-resource
The directory '/github/home/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-sql, pyspark-mllib, pyspark-resource
The directory '/github/home/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-sql, pyspark-mllib, pyspark-resource
The directory '/github/home/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-sql, pyspark-mllib, pyspark-resource
The directory '/github/home/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-sql, pyspark-mllib, pyspark-resource
Process completed with exit code 19.
Build modules: pyspark-core, pyspark-streaming, pyspark-ml
The directory '/github/home/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-core, pyspark-streaming, pyspark-ml
The directory '/github/home/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-core, pyspark-streaming, pyspark-ml
The directory '/github/home/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-core, pyspark-streaming, pyspark-ml
The directory '/github/home/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: sql - other tests (JDK 8, hadoop3.2, hive2.3)
Process completed with exit code 18.
Build modules: pyspark-pandas
The directory '/github/home/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-pandas
The directory '/github/home/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-pandas
The directory '/github/home/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-pandas
The directory '/github/home/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-pandas-slow
The directory '/github/home/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-pandas-slow
The directory '/github/home/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-pandas-slow
The directory '/github/home/.cache/pip/http' or its parent directory is not owned by the current user and the cache has been disabled. Please check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Build modules: pyspark-pandas-slow
The directory '/github/home/.cache/pip' or its parent directory is not owned by the current user and caching wheels has been disabled. check the permissions and owner of that directory. If executing pip with sudo, you may want sudo's -H flag.
Run docker integration tests
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Run docker integration tests
No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
Run docker integration tests
No files were found with the provided path: **/target/unit-tests.log. No artifacts will be uploaded.
Linters, licenses, dependencies and documentation generation
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Hadoop 2 build with SBT
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Scala 2.13 build with SBT
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Run TPC-DS queries with SF=1
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: pyspark-sql, pyspark-mllib, pyspark-resource
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-python@v2, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: pyspark-sql, pyspark-mllib, pyspark-resource
No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
Build modules: core, unsafe, kvstore, avro, network-common, network-shuffle, repl, launcher, examples, sketch, graphx (JDK 8, hadoop3.2, hive2.3)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: sparkr
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Java 11 build with Maven
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Java 17 build with Maven
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: pyspark-core, pyspark-streaming, pyspark-ml
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-python@v2, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: catalyst, hive-thriftserver (JDK 8, hadoop3.2, hive2.3)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: sql - slow tests (JDK 8, hadoop3.2, hive2.3)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1, actions/setup-python@v2, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: hive - slow tests (JDK 8, hadoop3.2, hive2.3)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: hive - other tests (JDK 8, hadoop3.2, hive2.3)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: sql - other tests (JDK 8, hadoop3.2, hive2.3)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1, actions/setup-python@v2, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: pyspark-pandas
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-python@v2, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: streaming, sql-kafka-0-10, streaming-kafka-0-10, mllib-local, mllib, yarn, mesos, kubernetes, hadoop-cloud, spark-ganglia-lgpl (JDK 8, hadoop3.2, hive2.3)
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-java@v1, actions/setup-python@v2, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/
Build modules: pyspark-pandas-slow
The following actions uses node12 which is deprecated and will be forced to run on node16: actions/checkout@v2, actions/cache@v2, actions/setup-python@v2, actions/upload-artifact@v2. For more info: https://github.blog/changelog/2023-06-13-github-actions-all-actions-will-run-on-node16-instead-of-node12-by-default/

Artifacts

Produced during runtime
Name (size, status)
test-results-catalyst, hive-thriftserver--8-hadoop3.2-hive2.3 (202 KB, Expired)
test-results-core, unsafe, kvstore, avro, network-common, network-shuffle, repl, launcher, examples, sketch, graphx--8-hadoop3.2-hive2.3 (153 KB, Expired)
test-results-hive-- other tests-8-hadoop3.2-hive2.3 (914 KB, Expired)
test-results-hive-- slow tests-8-hadoop3.2-hive2.3 (950 KB, Expired)
test-results-pyspark-core, pyspark-streaming, pyspark-ml--8-hadoop3.2-hive2.3 (75.7 KB, Expired)
test-results-pyspark-pandas--8-hadoop3.2-hive2.3 (240 KB, Expired)
test-results-pyspark-pandas-slow--8-hadoop3.2-hive2.3 (159 KB, Expired)
test-results-sparkr--8-hadoop3.2-hive2.3 (268 KB, Expired)
test-results-sql-- other tests-8-hadoop3.2-hive2.3 (4.3 MB, Expired)
test-results-sql-- slow tests-8-hadoop3.2-hive2.3 (2.39 MB, Expired)
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, mllib-local, mllib, yarn, mesos, kubernetes, hadoop-cloud, spark-ganglia-lgpl--8-hadoop3.2-hive2.3 (1.77 MB, Expired)
test-results-tpcds--8-hadoop3.2-hive2.3 (22.7 KB, Expired)
unit-tests-log-pyspark-sql, pyspark-mllib, pyspark-resource--8-hadoop3.2-hive2.3 (241 MB, Expired)
unit-tests-log-sql-- other tests-8-hadoop3.2-hive2.3 (424 MB, Expired)