DOCSP-31213 - streaming config (#183)
mongoKart authored Nov 14, 2023
1 parent eb8f1f0 commit 7ab9ace
Showing 48 changed files with 2,004 additions and 1,524 deletions.
7 changes: 6 additions & 1 deletion config/redirects
@@ -61,4 +61,9 @@ raw: ${prefix}/sparkR -> ${base}/v3.0/r-api/
[*-v3.0]: ${prefix}/${version}/configuration/read -> ${base}/${version}/
[*-v3.0]: ${prefix}/${version}/write-to-mongodb -> ${base}/${version}/
[*-v3.0]: ${prefix}/${version}/read-from-mongodb -> ${base}/${version}/
[*-v3.0]: ${prefix}/${version}/structured-streaming -> ${base}/${version}/
[v10.0-*]: ${prefix}/${version}/configuration/write -> ${base}/${version}/batch-mode/batch-write-config/
[v10.0-*]: ${prefix}/${version}/configuration/read -> ${base}/${version}/batch-mode/batch-read-config/
[v10.0-*]: ${prefix}/${version}/write-to-mongodb -> ${base}/${version}/batch-mode/batch-write/
[v10.0-*]: ${prefix}/${version}/read-from-mongodb -> ${base}/${version}/batch-mode/batch-read/
[v10.0-*]: ${prefix}/${version}/structured-streaming -> ${base}/${version}/streaming-mode/
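Each redirect rule above maps a source path template to a target URL, with ``${prefix}``, ``${base}``, and ``${version}`` substituted at build time. As a rough illustration of that expansion (the variable values below are hypothetical placeholders, not the actual build configuration), the substitution works like this:

```python
from string import Template

# Hypothetical values -- the real ${prefix} and ${base} come from the
# docs build configuration, not from this snippet.
variables = {
    "prefix": "docs/spark-connector",
    "base": "https://www.mongodb.com/docs/spark-connector",
    "version": "v10.1",
}

# One rule from the redirects file, in "source -> target" form.
rule = "${prefix}/${version}/write-to-mongodb -> ${base}/${version}/batch-mode/batch-write/"

# Split the rule on the arrow, then substitute the template variables
# into each side.
source, target = (
    Template(part.strip()).substitute(variables) for part in rule.split("->")
)

print(source)  # docs/spark-connector/v10.1/write-to-mongodb
print(target)  # https://www.mongodb.com/docs/spark-connector/v10.1/batch-mode/batch-write/
```

The ``[v10.0-*]`` prefix scopes a rule to a version range, which is why the new batch-mode and streaming-mode targets apply only from v10.0 onward.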
10 changes: 9 additions & 1 deletion snooty.toml
@@ -3,7 +3,15 @@ title = "Spark Connector"

intersphinx = ["https://www.mongodb.com/docs/manual/objects.inv"]

toc_landing_pages = ["configuration"]
toc_landing_pages = [
"configuration",
"/batch-mode",
"/streaming-mode",
"/streaming-mode/streaming-read",
"/streaming-mode/streaming-write",
"/batch-mode/batch-write",
"/batch-mode/batch-read",
]

[constants]
connector-short = "Spark Connector"
32 changes: 32 additions & 0 deletions source/batch-mode.txt
@@ -0,0 +1,32 @@
==========
Batch Mode
==========

.. contents:: On this page
:local:
:backlinks: none
:depth: 1
:class: singlecol

.. toctree::

/batch-mode/batch-read
/batch-mode/batch-write

Overview
--------

In batch mode, you can use the Spark Dataset and DataFrame APIs to process your data
in discrete batches at a specified time interval.

The following sections show you how to use the {+connector-short+} to read data from
MongoDB and write data to MongoDB in batch mode:

- :ref:`batch-read-from-mongodb`
- :ref:`batch-write-to-mongodb`

.. tip:: Apache Spark Documentation

To learn more about using Spark to process batches of data, see the
`Spark Programming Guide
<https://spark.apache.org/docs/latest/sql-programming-guide.html>`__.
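In connector v10.x, batch reads and writes use the ``mongodb`` data source format, with
options keyed under the ``spark.mongodb.read.*`` and ``spark.mongodb.write.*`` namespaces.
The helper below is a hypothetical sketch (it is not part of the connector) showing how
those fully qualified option keys are composed:

```python
def mongo_options(mode: str, options: dict) -> dict:
    """Prefix batch-mode options with the connector's namespace.

    `mode` is "read" or "write"; the key layout follows the
    spark.mongodb.<mode>.<option> convention used by connector v10.x.
    """
    if mode not in ("read", "write"):
        raise ValueError("mode must be 'read' or 'write'")
    return {f"spark.mongodb.{mode}.{key}": value for key, value in options.items()}

# Hypothetical connection details for illustration only.
read_conf = mongo_options("read", {
    "connection.uri": "mongodb://localhost:27017",
    "database": "test",
    "collection": "movies",
})

print(read_conf["spark.mongodb.read.database"])  # test
```

With a running cluster, keys like these would typically be supplied through
``SparkSession.builder.config(...)`` or passed as options to
``spark.read.format("mongodb")`` and ``df.write.format("mongodb")``.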