fix/add links (#1900)
aeluce authored Jan 27, 2025

1 parent a6ea1b7 commit 073c45e
Showing 15 changed files with 50 additions and 46 deletions.
Original file line number Diff line number Diff line change
@@ -48,18 +48,18 @@ To use this connector, you'll need a MariaDB database setup with the following.

2. Create a RDS parameter group to enable replication in MariaDB.

1. [Create a parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithDBInstanceParamGroups.html#USER_WorkingWithParamGroups.Creating).
1. [Create a parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithParamGroups.Creating.html).
Create a unique name and description and set the following properties:

- **Family**: mariadb10.6
- **Type**: DB Parameter group

2. [Modify the new parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithDBInstanceParamGroups.html#USER_WorkingWithParamGroups.Modifying) and update the following parameters:
2. [Modify the new parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithParamGroups.Modifying.html) and update the following parameters:

- binlog_format: ROW

3. [Associate the parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithDBInstanceParamGroups.html#USER_WorkingWithParamGroups.Associating)
with the database and set [Backup Retention Period](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithAutomatedBackups.html#USER_WorkingWithAutomatedBackups.Enabling) to 7 days.
3. [Associate the parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithParamGroups.Associating.html)
with the database and set [Backup Retention Period](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithAutomatedBackups.Enabling.html) to 7 days.
Reboot the database to allow the changes to take effect.
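
After the reboot, you can confirm the parameter group took effect from any MariaDB client. This is a hedged sanity check rather than part of the documented setup steps; the endpoint and credentials are your own:

```sql
-- Should report ROW once the new parameter group is active
SHOW GLOBAL VARIABLES LIKE 'binlog_format';
```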

3. Switch to your MariaDB client. Run the following commands to create a new user for the capture with appropriate permissions:
@@ -84,7 +84,7 @@ This connector supports capturing from a read replica of your database, provided
binary logging is enabled on the replica and all other requirements are met. To create
a read replica:

1. Follow RDS instructions to [create a read replica](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ReadRepl.html#USER_ReadRepl.Create)
1. Follow RDS instructions to [create a read replica](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ReadRepl.Create.html)
of your MariaDB database.

2. [Modify the replica](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.DBInstance.Modifying.html)
@@ -39,18 +39,18 @@ To use this connector, you'll need a MySQL database setup with the following.

2. Create a RDS parameter group to enable replication in MySQL.

1. [Create a parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithDBInstanceParamGroups.html#USER_WorkingWithParamGroups.Creating).
1. [Create a parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithParamGroups.Creating.html).
Create a unique name and description and set the following properties:

- **Family**: mysql8.0
- **Type**: DB Parameter group

2. [Modify the new parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithDBInstanceParamGroups.html#USER_WorkingWithParamGroups.Modifying) and update the following parameters:
2. [Modify the new parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithParamGroups.Modifying.html) and update the following parameters:

- binlog_format: ROW

3. [Associate the parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithDBInstanceParamGroups.html#USER_WorkingWithParamGroups.Associating)
with the database and set [Backup Retention Period](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithAutomatedBackups.html#USER_WorkingWithAutomatedBackups.Enabling) to 7 days.
3. [Associate the parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithParamGroups.Associating.html)
with the database and set [Backup Retention Period](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithAutomatedBackups.Enabling.html) to 7 days.
Reboot the database to allow the changes to take effect.

3. Switch to your MySQL client. Run the following commands to create a new user for the capture with appropriate permissions:
@@ -77,7 +77,7 @@ This connector supports capturing from a read replica of your database, provided
binary logging is enabled on the replica and all other requirements are met. To create
a read replica:

1. Follow RDS instructions to [create a read replica](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ReadRepl.html#USER_ReadRepl.Create)
1. Follow RDS instructions to [create a read replica](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_ReadRepl.Create.html)
of your MySQL database.

2. [Modify the replica](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.DBInstance.Modifying.html)
@@ -1,7 +1,7 @@
# OracleDB
This connector captures data from OracleDB into Flow collections using [Oracle Logminer](https://docs.oracle.com/en/database/oracle/oracle-database/19/sutil/oracle-logminer-utility.html#GUID-2555A155-01E3-483E-9FC6-2BDC2D8A4093).

It is available for use in the Flow web application. For local development or open-source workflows, `ghcr.io/estuary/source-oracle:dev` provides the latest version of the connector as a Docker image. You can also follow the link in your browser to see past image versions.
It is available for use in the Flow web application. For local development or open-source workflows, [`ghcr.io/estuary/source-oracle:dev`](https://ghcr.io/estuary/source-oracle:dev) provides the latest version of the connector as a Docker image. You can also follow the link in your browser to see past image versions.

## Prerequisites
* Oracle 11g or above
@@ -153,7 +153,7 @@ To allow secure connections via SSH tunneling:

### Sample

```json
```yaml
captures:
${PREFIX}/${CAPTURE_NAME}:
endpoint:
@@ -172,7 +172,7 @@ captures:
sshForwarding:
privateKey: -----BEGIN RSA PRIVATE KEY-----\n...
sshEndpoint: ssh://[email protected]:22

bindings:
- resource:
namespace: ${TABLE_NAMESPACE}
@@ -2,7 +2,7 @@
# OracleDB (Flashback)
This connector captures data from OracleDB into Flow collections using [Oracle Flashback](https://www.oracle.com/database/technologies/flashback/).

It is available for use in the Flow web application. For local development or open-source workflows, ghcr.io/estuary/source-oracle-flashback:dev provides the latest version of the connector as a Docker image. You can also follow the link in your browser to see past image versions.
It is available for use in the Flow web application. For local development or open-source workflows, [`ghcr.io/estuary/source-oracle-flashback:dev`](https://ghcr.io/estuary/source-oracle-flashback:dev) provides the latest version of the connector as a Docker image. You can also follow the link in your browser to see past image versions.

## Prerequisites
* Oracle 11g or above
@@ -48,7 +48,7 @@ GRANT SELECT ON V$DATABASE TO estuary_flow_user;
GRANT SELECT_CATALOG_ROLE TO estuary_flow_user;
```

1. Your database user should now be ready for use with Estuary Flow.
5. Your database user should now be ready for use with Estuary Flow.

### Recommended Database Configuration

@@ -97,7 +97,7 @@ To allow secure connections via SSH tunneling:

### Sample

```json
```yaml
captures:
${PREFIX}/${CAPTURE_NAME}:
endpoint:
@@ -114,7 +114,7 @@ captures:
sshForwarding:
privateKey: -----BEGIN RSA PRIVATE KEY-----\n...
sshEndpoint: ssh://[email protected]:22

bindings:
- resource:
name: ${TABLE_NAME}
@@ -42,15 +42,15 @@ You'll need a PostgreSQL database setup with the following:

2. Enable logical replication on your RDS PostgreSQL instance.

1. Create a [parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithDBInstanceParamGroups.html#USER_WorkingWithParamGroups.Creating).
1. Create a [parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithParamGroups.Creating.html).
Create a unique name and description and set the following properties:

- **Family**: postgres13
- **Type**: DB Parameter group

2. [Modify the new parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithDBInstanceParamGroups.html#USER_WorkingWithParamGroups.Modifying) and set `rds.logical_replication=1`.
2. [Modify the new parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithParamGroups.Modifying.html) and set `rds.logical_replication=1`.

3. [Associate the parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithDBInstanceParamGroups.html#USER_WorkingWithParamGroups.Associating) with the database.
3. [Associate the parameter group](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/USER_WorkingWithParamGroups.Associating.html) with the database.

4. Reboot the database to allow the new parameter group to take effect.
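
Once the instance is back up, you can verify the setting from a `psql` session. A hedged sketch, assuming you can already connect to the instance:

```sql
-- On RDS for PostgreSQL, this should return 'on' after the reboot
SHOW rds.logical_replication;
```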

@@ -155,7 +155,7 @@ store them separately.
TOASTed values can sometimes present a challenge for systems that rely on the PostgreSQL write-ahead log (WAL), like this connector.
If a change event occurs on a row that contains a TOASTed value, _but the TOASTed value itself is unchanged_, it is omitted from the WAL.
As a result, the connector emits a row update with the a value omitted, which might cause
As a result, the connector emits a row update with the value omitted, which might cause
unexpected results in downstream catalog tasks if adjustments are not made.
The PostgreSQL connector handles TOASTed values for you when you follow the [standard discovery workflow](/concepts/connectors.md#flowctl-discover)
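
Where fine-grained control is needed, one common manual mitigation is to have PostgreSQL log full old-row images so TOASTed columns are never omitted from the WAL. This is shown only as a hedged illustration; it is not required when using the standard discovery workflow, and it increases WAL volume. The table name below is a placeholder:

```sql
-- Apply per captured table as needed; full old-row images are then
-- written to the WAL, at the cost of additional WAL volume.
ALTER TABLE public.my_table REPLICA IDENTITY FULL;
```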
@@ -3,6 +3,8 @@
This connector captures data from Postgres into Flow collections by periodically
executing queries and translating the results into JSON documents.

For local development or open-source workflows, [`ghcr.io/estuary/source-postgres-batch:dev`](https://ghcr.io/estuary/source-postgres-batch:dev) provides the latest version of the connector as a Docker image. You can also follow the link in your browser to see past image versions.

We recommend using our [PostgreSQL CDC Connector](http://go.estuary.dev/source-postgres) instead
if possible. Using CDC provides lower latency data capture, delete and update events, and usually
has a smaller impact on the source database.
@@ -31,21 +31,14 @@ To capture change events from SQL Server tables using this connector, you need:
- Access to the change tables created as part of the SQL Server CDC process.
- `SELECT`, `INSERT`, and `UPDATE` permissions on the watermarks table

To meet these requirements, follow the steps for your hosting type.

- [Self-hosted SQL Server](#setup-self-hosted-sql-server)
- [Azure SQL Database](#setup-azure-sql-database)
- [Amazon RDS for SQL Server](#setup-amazon-rds-for-sql-server)
- [Google Cloud SQL for SQL Server](#setup-google-cloud-sql-for-sql-server)

## Setup

1. Allow connections between the database and Estuary Flow. There are two ways to do this: by granting direct access to Flow's IP or by creating an SSH tunnel.

1. To allow direct access:

- [Modify the database](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.DBInstance.Modifying.html), setting **Public accessibility** to **Yes**.
- Edit the VPC security group associated with your database, or create a new VPC security group and associate it with the database as described in [the Amazon documentation](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.RDSSecurityGroups.html#Overview.RDSSecurityGroups.Create).Create a new inbound rule and a new outbound rule that allow all traffic from the [Estuary Flow IP addresses](/reference/allow-ip-addresses).
- Edit the VPC security group associated with your database, or create a new VPC security group and associate it with the database as described in [the Amazon documentation](https://docs.aws.amazon.com/AmazonRDS/latest/UserGuide/Overview.RDSSecurityGroups.html#Overview.RDSSecurityGroups.Create). Create a new inbound rule and a new outbound rule that allow all traffic from the [Estuary Flow IP addresses](/reference/allow-ip-addresses).

2. To allow secure connections via SSH tunneling:
- Follow the guide to [configure an SSH server for tunneling](/guides/connect-network/)
@@ -36,15 +36,15 @@ To capture change events from SQL Server tables using this connector, you need:
- A user role with:
- `SELECT` permissions on the CDC schema and the schemas that contain tables to be captured.
- Access to the change tables created as part of the SQL Server CDC process.
- `SELECT`, `INSERT`, and `UPDATE` permissions on the watermarks table
- `SELECT`, `INSERT`, and `UPDATE` permissions on the watermarks table.

## Setup

To meet these requirements, follow the steps for your hosting type.

- [Self-hosted SQL Server](#self-hosted-sql-server)
- [Azure SQL Database](#azure-sql-database)
- [Amazon RDS for SQL Server](./amazon-rds-sqlserver/))
- [Amazon RDS for SQL Server](./amazon-rds-sqlserver/)
- [Google Cloud SQL for SQL Server](./google-cloud-sql-sqlserver/)

### Self-hosted SQL Server
@@ -3,7 +3,7 @@

This connector captures data from Aircall into Flow collections.

It is available for use in the Flow web application. For local development or open-source workflows, ghcr.io/estuary/source-aircall:dev provides the latest version of the connector as a Docker image. You can also follow the link in your browser to see past image versions.
It is available for use in the Flow web application. For local development or open-source workflows, [`ghcr.io/estuary/source-aircall:dev`](https://ghcr.io/estuary/source-aircall:dev) provides the latest version of the connector as a Docker image. You can also follow the link in your browser to see past image versions.

## Prerequisites
To set up the Aircall connector, you need the following prerequisite:
@@ -2,7 +2,7 @@

This connector lets you capture data from your Google Drive account into Flow collections.

[ghcr.io/estuary/source-google-drive:dev](https://ghcr.io/estuary/source-google-drive:dev) provides the latest connector image. For access to previous image versions, follow the link in your browser.
[`ghcr.io/estuary/source-google-drive:dev`](https://ghcr.io/estuary/source-google-drive:dev) provides the latest connector image. For access to previous image versions, follow the link in your browser.

## Prerequisites

@@ -4,11 +4,13 @@
This connector captures messages in JSON format into Flow collections from
Google Cloud Pub/Sub topics.

During setup, this connect will discover all topics it has access to. Each
During setup, this connector will discover all topics it has access to. Each
[capture binding](../../../concepts/README.md#resources-and-bindings) that is
enabled for a topic will automatically create a new subscription, and the
connector will read messages from that subscription.

[`ghcr.io/estuary/source-google-pubsub:dev`](https://ghcr.io/estuary/source-google-pubsub:dev) provides the latest connector image. You can also follow the link in your browser to see past image versions.

## Prerequisites

To use this connector, you will need the following prerequisites:
@@ -4,6 +4,17 @@

This connector captures data from one LinkedIn Page into Flow collections via the [LinkedIn Marketing API](https://learn.microsoft.com/en-us/linkedin/marketing/integrations/marketing-integrations-overview?view=li-lms-2024-03).

[`ghcr.io/estuary/source-linkedin-pages:dev`](https://ghcr.io/estuary/source-linkedin-pages:dev) provides the latest connector image. You can also follow the link in your browser to see past image versions.

## Supported Streams

- [Organization Lookup](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/organization-lookup-api?tabs=http#retrieve-organizations)
- [Follower Statistics](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/follower-statistics?tabs=http#retrieve-lifetime-follower-statistics)
- [Share Statistics](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/share-statistics?tabs=http#retrieve-lifetime-share-statistics)
- [Total Follower Count](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/organization-lookup-api?tabs=http#retrieve-organization-follower-count)

By default, each resource is mapped to a Flow collection through a separate binding.

## Prerequisites

* An existing LinkedIn Account
@@ -60,10 +71,3 @@ See [connectors](/concepts/connectors.md#using-connectors) to learn more about u
| `/refresh_token` | Refresh Token | The token value generated using the LinkedIn Developers [OAuth Token Tools](https://www.linkedin.com/developers/tools/oauth). | string | Required |
| `/access_token` | Access Token | The token value generated using the LinkedIn Developers [OAuth Token Tools](https://www.linkedin.com/developers/tools/oauth). | string | Required |


## Supported Streams

- [Organization Lookup](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/organization-lookup-api?tabs=http#retrieve-organizations)
- [Follower Statistics](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/follower-statistics?tabs=http#retrieve-lifetime-follower-statistics)
- [Share Statistics](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/share-statistics?tabs=http#retrieve-lifetime-share-statistics)
- [Total Follower Count](https://docs.microsoft.com/en-us/linkedin/marketing/integrations/community-management/organizations/organization-lookup-api?tabs=http#retrieve-organization-follower-count)
@@ -2,6 +2,9 @@

This connector captures data from Oracle NetSuite into Flow collections. It relies on the SuiteAnalytics Connect feature in order to both load large amounts of data quickly, as well as introspect the available tables, their schemas, keys, and cursor fields.

[`ghcr.io/estuary/source-netsuite:dev`](https://ghcr.io/estuary/source-netsuite:dev) provides the
latest connector image. You can also follow the link in your browser to see past image versions.

If you don't have SuiteAnalytics Connect, check out our [SuiteTalk REST](../netsuite-suitetalk) connector.

## Supported data resources
@@ -118,7 +121,7 @@ See [connectors](../../../concepts/connectors.md#using-connectors) to learn more

| Property | Title | Description | Type | Required/Default |
| ----------------------------- | ---------------------- | ------------------------------------------------------------------------------------------------ | ------ | ---------------- |
| `/account | Netsuite Account ID | Netsuite realm/Account ID e.g. 2344535, as for `production` or 2344535_SB1, as for the `sandbox` | string | Required |
| `/account` | Netsuite Account ID | Netsuite realm/Account ID e.g. 2344535, as for `production` or 2344535_SB1, as for `sandbox` | string | Required |
| `/role_id` | Role ID | The ID of the role you created. Defaults to 3, which is the ID of the administrator role. | int | 3 |
| `/suiteanalytics_data_source` | Data Source | Which NetSuite data source to use. Options are `NetSuite.com`, or `NetSuite2.com` | string | Required |
| `/authentication` | Authentication Details | Credentials to access your NetSuite account | object | Required |
@@ -2,7 +2,7 @@
# Pinterest
This connector captures data from Pinterest into Flow collections.

It is available for use in the Flow web application. For local development or open-source workflows, ghcr.io/estuary/source-pinterest:dev provides the latest version of the connector as a Docker image. You can also follow the link in your browser to see past image versions.
It is available for use in the Flow web application. For local development or open-source workflows, [`ghcr.io/estuary/source-pinterest:dev`](https://ghcr.io/estuary/source-pinterest:dev) provides the latest version of the connector as a Docker image. You can also follow the link in your browser to see past image versions.

## Prerequisites
To set up the Pinterest source connector, you'll need the following prerequisites:
@@ -141,5 +141,5 @@ The Pinterest API imposes certain rate limits for the connector. Please take not
* Boards streams: 10 calls per second per user per app

:::note
For any additional information or troubleshooting, refer to the official Pinterest API documentation.
For any additional information or troubleshooting, refer to the official [Pinterest API documentation](https://developers.pinterest.com/docs/overview/welcome/).
:::
@@ -2,10 +2,10 @@
# WooCommerce
This connector captures data from WooCommerce into Flow collections.

It is available for use in the Flow web application. For local development or open-source workflows, ghcr.io/estuary/source-woocommerce:dev provides the latest version of the connector as a Docker image. You can also follow the link in your browser to see past image versions.
It is available for use in the Flow web application. For local development or open-source workflows, [`ghcr.io/estuary/source-woocommerce:dev`](https://ghcr.io/estuary/source-woocommerce:dev) provides the latest version of the connector as a Docker image. You can also follow the link in your browser to see past image versions.

## Prerequisites
To set up the WooCommerce source connector with: you need:
To set up the WooCommerce source connector you need:

* WooCommerce 3.5+
* WordPress 4.4+
