Update ParquetIOSuite.scala #130
build_main.yml (on: push)
- Run / Check changes (31s)
- Run / Protobuf breaking change detection and Python CodeGen check (1m 1s)
- Run / Run TPC-DS queries with SF=1 (1h 17m)
- Run / Run Docker integration tests (42m 35s)
- Run / Run Spark on Kubernetes Integration test (1h 2m)
- Run / Run Spark UI tests (19s)

Matrix: Run / build

- Run / Build modules: sparkr (27m 21s)
- Run / Linters, licenses, and dependencies (27m 24s)
- Run / Documentation generation (34m 42s)
Matrix: Run / pyspark
Annotations
10 errors and 7 warnings
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-6b5e978fc230074f-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-e299068fc230ee2e-exec-1".
- Run / Run Spark on Kubernetes Integration test: sleep interrupted
- Run / Run Spark on Kubernetes Integration test: Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$621/0x00007ff0905298c0@11726b5d rejected from java.util.concurrent.ThreadPoolExecutor@3c54e5f4[Shutting down, pool size = 2, active threads = 2, queued tasks = 0, completed tasks = 406]
- Run / Run Spark on Kubernetes Integration test: sleep interrupted
- Run / Run Spark on Kubernetes Integration test: Task io.fabric8.kubernetes.client.utils.internal.SerialExecutor$$Lambda$621/0x00007ff0905298c0@791fc52a rejected from java.util.concurrent.ThreadPoolExecutor@3c54e5f4[Shutting down, pool size = 1, active threads = 1, queued tasks = 0, completed tasks = 407]
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-2b9a0d8fc2430eaa-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-2421688fc243f30a-exec-1".
- Run / Run Spark on Kubernetes Integration test: HashSet() did not contain "decomtest-b345d38fc247918f-exec-1".
- Run / Run Spark on Kubernetes Integration test: Status(apiVersion=v1, code=404, details=StatusDetails(causes=[], group=null, kind=pods, name=spark-test-app-9c27cca5e4254be7a7a41282e406e69e-driver, retryAfterSeconds=null, uid=null, additionalProperties={}), kind=Status, message=pods "spark-test-app-9c27cca5e4254be7a7a41282e406e69e-driver" not found, metadata=ListMeta(_continue=null, remainingItemCount=null, resourceVersion=null, selfLink=null, additionalProperties={}), reason=NotFound, status=Failure, additionalProperties={}).
- Run / Run Spark on Kubernetes Integration test: Failed to download action 'https://api.github.com/repos/actions/checkout/tarball/a5ac7e51b41094c92402da3b24376905380afc29'. Error: The request was canceled due to the configured HttpClient.Timeout of 100 seconds elapsing.
- Run / Run Spark on Kubernetes Integration test: Back off 24.533 seconds before retry.
- Run / Run Spark on Kubernetes Integration test: Failed to download action 'https://api.github.com/repos/actions/checkout/tarball/a5ac7e51b41094c92402da3b24376905380afc29'. Error: The request was canceled due to the configured HttpClient.Timeout of 100 seconds elapsing.
- Run / Run Spark on Kubernetes Integration test: Back off 13.107 seconds before retry.
- Run / Build modules: pyspark-connect: Failed to download action 'https://api.github.com/repos/actions/checkout/tarball/a5ac7e51b41094c92402da3b24376905380afc29'. Error: The request was canceled due to the configured HttpClient.Timeout of 100 seconds elapsing.
- Run / Build modules: pyspark-connect: Back off 10.8 seconds before retry.
- Run / Build modules: pyspark-core, pyspark-errors, pyspark-streaming: No files were found with the provided path: **/target/test-reports/*.xml. No artifacts will be uploaded.
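The "No files were found" warning is the message actions/upload-artifact emits when its path pattern matches nothing. A minimal sketch of such a step, assuming a step name and settings not taken from the actual workflow:

```yaml
# Hypothetical upload step; only the path pattern comes from the warning above.
- name: Upload test results
  if: always()
  uses: actions/upload-artifact@v4
  with:
    name: test-results
    path: "**/target/test-reports/*.xml"
    # "warn" (the default) logs the message seen above without failing the job;
    # "error" would fail the step, "ignore" would stay silent.
    if-no-files-found: warn
```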
Artifacts
Produced during runtime

Name | Size
---|---
test-results-streaming, sql-kafka-0-10, streaming-kafka-0-10, streaming-kinesis-asl, kubernetes, hadoop-cloud, spark-ganglia-lgpl, protobuf, connect--17-hadoop3-hive2.3 (Expired) | 359 KB