configurable test on docker (#344)
Signed-off-by: Pawel Leszczynski <[email protected]>
pawel-big-lebowski authored Jul 19, 2024
1 parent 9491d7c commit 6ad6e8c
Showing 1 changed file with 15 additions and 0 deletions.
15 changes: 15 additions & 0 deletions docs/integrations/spark/testing.md
@@ -71,6 +71,21 @@ the config has to match images available.
* `sparkConf` can be used to pass any Spark configuration entries. The OpenLineage transport is file based, with a file location that is set within the test being run; those settings should not be overridden.
* `packages` lets you define custom jar packages to be installed with the `spark-submit` command (see the sketch below).
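
For illustration, here is a minimal sketch of how these entries might look in a test configuration; the exact key layout, the `spark.sql.shuffle.partitions` setting, and the Delta package coordinates are hypothetical examples, not values taken from the original docs:
```yaml
appName: "CLI test application"
sparkVersion: "3.3.3"
sparkConf:
  # hypothetical Spark configuration entry; any valid key/value pair could go here
  spark.sql.shuffle.partitions: "1"
packages:
  # hypothetical custom jar package installed via spark-submit
  - "io.delta:delta-core_2.12:2.4.0"
```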

As of version 1.18, instead of `sparkVersion`, the Spark configuration can accept configuration
entries that determine the Docker image the test is run on:
```yaml
appName: "CLI test application"
docker:
  image: "apache/spark:3.3.3-scala2.12-java11-python3-ubuntu"
  sparkSubmit: /opt/spark/bin/spark-submit
  waitForLogMessage: ".*ShutdownHookManager: Shutdown hook called.*"
  scalaBinaryVersion: 2.12
```
where:
* `image` specifies the Docker image used to run the Spark job,
* `sparkSubmit` is the file location of the `spark-submit` command,
* `waitForLogMessage` is a regex matching the log entry that indicates the Spark job has finished.

### Test definition directories

* The specified test directory should contain one or more directories, and each of the subdirectories contains a separate test definition.