diff --git a/documentation/src/docs/prepare-development-environment.md b/documentation/src/docs/prepare-development-environment.md
index 3317bf9..1679aa4 100644
--- a/documentation/src/docs/prepare-development-environment.md
+++ b/documentation/src/docs/prepare-development-environment.md
@@ -9,6 +9,7 @@ you may not use this file except in compliance with the License.
 You may obtain a copy of the License at
 http://www.apache.org/licenses/LICENSE-2.0
 -->
+import versions from '@site/versions';
 import { IfHaveFeature, IfMissingFeature, SDPVersion } from 'nautilus-docs';
 import PrepareDevelopmentEnvironmentPrefix from '../snippets/spark-connectors/prepare-development-environment-prefix.md';
@@ -36,15 +37,15 @@ This will run a development instance of Pravega locally.
 The transaction parameters allow transactions to remain open for up to 30 days without lease renewals.
 
-```shell
-cd
+<CodeBlock language="shell">
+{`cd
 git clone https://github.com/pravega/pravega
 cd pravega
-git checkout r0.9
-./gradlew startStandalone \
-    -Dcontroller.transaction.lease.count.max=2592000000 \
-    -Dcontroller.transaction.execution.timeBound.days=30
-```
+git checkout ${versions['pravega-branch']}
+./gradlew startStandalone \\
+    -Dcontroller.transaction.lease.count.max=2592000000 \\
+    -Dcontroller.transaction.execution.timeBound.days=30`}
+</CodeBlock>
 ## Install Apache Spark
diff --git a/documentation/src/docs/tutorial-1-writing-to-pravega.md b/documentation/src/docs/tutorial-1-writing-to-pravega.md
index dc9a465..ff25ae2 100644
--- a/documentation/src/docs/tutorial-1-writing-to-pravega.md
+++ b/documentation/src/docs/tutorial-1-writing-to-pravega.md
@@ -9,6 +9,7 @@ you may not use this file except in compliance with the License.
 You may obtain a copy of the License at
 http://www.apache.org/licenses/LICENSE-2.0
 -->
+import versions from '@site/versions';
 
 A simple Python Spark (PySpark) application will consist of a single `.py` file.
 Our first application will be [stream_generated_data_to_pravega.py](https://github.com/pravega/pravega-samples/blob/spark-connector-examples/spark-connector-examples/src/main/python/stream_generated_data_to_pravega.py) and it will continuously write a timestamp to a Pravega stream.
@@ -90,15 +91,15 @@ Follow these steps to run this application locally and write to your local devel
 2. To run this application locally and write to your local development installation of Pravega, we'll use `spark-submit --master 'local[2]'`.
    This will start a Spark mini-cluster on your local system and use 2 CPUs.
 
-   ```shell
-   spark-submit \
-   --master 'local[2]' \
-   --driver-memory 4g \
-   --executor-memory 4g \
-   --total-executor-cores 1 \
-   --packages io.pravega:pravega-connectors-spark-3.0_2.12:0.9.0 \
-   stream_generated_data_to_pravega.py
-   ```
+<CodeBlock language="shell">
+{`spark-submit \\
+  --master 'local[2]' \\
+  --driver-memory 4g \\
+  --executor-memory 4g \\
+  --total-executor-cores 1 \\
+  --packages io.pravega:pravega-connectors-spark-3.0_2.12:${versions['spark-connectors']} \\
+  stream_generated_data_to_pravega.py`}
+</CodeBlock>
 This job will continue to run and write events until stopped.
diff --git a/documentation/src/docs/tutorial-3-writing-to-pravega-java.md b/documentation/src/docs/tutorial-3-writing-to-pravega-java.md
index 121ba15..d52c262 100644
--- a/documentation/src/docs/tutorial-3-writing-to-pravega-java.md
+++ b/documentation/src/docs/tutorial-3-writing-to-pravega-java.md
@@ -9,6 +9,7 @@ you may not use this file except in compliance with the License.
 You may obtain a copy of the License at
 http://www.apache.org/licenses/LICENSE-2.0
 -->
+import versions from '@site/versions';
 
 In this tutorial, we will create a Java version of `stream_generated_data_to_pravega.py` which was described in [Tutorial 1](tutorial-1-writing-to-pravega.md).
@@ -92,16 +93,16 @@ Follow these steps to run this application locally and write to your local devel
 1. Run spark-submit.
 
-   ```shell
-   spark-submit \
-   --master 'local[2]' \
-   --driver-memory 4g \
-   --executor-memory 4g \
-   --total-executor-cores 1 \
-   --packages io.pravega:pravega-connectors-spark-3.0_2.12:0.9.0 \
-   --class GeneratedDataToPravega \
-   build/libs/my-spark-app.jar
-   ```
+<CodeBlock language="shell">
+{`spark-submit \\
+  --master 'local[2]' \\
+  --driver-memory 4g \\
+  --executor-memory 4g \\
+  --total-executor-cores 1 \\
+  --packages io.pravega:pravega-connectors-spark-3.0_2.12:${versions['spark-connectors']} \\
+  --class GeneratedDataToPravega \\
+  build/libs/my-spark-app.jar`}
+</CodeBlock>
This job will continue to run and write events until stopped.
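
The `${versions['pravega-branch']}` and `${versions['spark-connectors']}` interpolations above rely on the `@site/versions` module imported at the top of each page. That module is not part of this diff; the sketch below is a hypothetical `versions.ts`, with its two keys seeded from the hard-coded values the diff removes (`r0.9` and `0.9.0`), just to show the shape the pages expect.

```typescript
// versions.ts -- hypothetical sketch of the `@site/versions` module; not part of this diff.
// Only the two keys referenced by these pages are shown, seeded with the values
// that were previously hard-coded in the docs.
const versions: Record<string, string> = {
  'pravega-branch': 'r0.9',     // branch checked out in prepare-development-environment.md
  'spark-connectors': '0.9.0',  // connector version used in the spark-submit --packages examples
};

export default versions;
```

Keeping these strings in one module means a release bump only touches that file, while the MDX pages render the current values inside their shell snippets.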