Commit e03bf99

TiDB Cloud serverless export update (#19824)
1 parent 1a7fa5d commit e03bf99

3 files changed: +115 -29 lines changed

tidb-cloud/serverless-export.md

Lines changed: 75 additions & 26 deletions
@@ -42,39 +42,31 @@ Exporting data to a local file has the following limitations:

 To export data to Amazon S3, you need to provide the following information:

-- URI: `s3://<bucket-name>/<file-path>`
+- URI: `s3://<bucket-name>/<folder-path>/`
 - One of the following access credentials:
     - [An access key](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html): make sure the access key has the `s3:PutObject` and `s3:ListBucket` permissions.
-    - [A role ARN](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference-arns.html): make sure the role ARN has the `s3:PutObject` and `s3:ListBucket` permissions.
+    - [A role ARN](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference-arns.html): make sure the role ARN (Amazon Resource Name) has the `s3:PutObject` and `s3:ListBucket` permissions.

 For more information, see [Configure External Storage Access for TiDB Cloud Serverless](/tidb-cloud/serverless-external-storage.md#configure-amazon-s3-access).

 ### Google Cloud Storage

 To export data to Google Cloud Storage, you need to provide the following information:

-- URI: `gs://<bucket-name>/<file-path>`
+- URI: `gs://<bucket-name>/<folder-path>/`
 - Access credential: a **base64 encoded** [service account key](https://cloud.google.com/iam/docs/creating-managing-service-account-keys) for your bucket. Make sure the service account key has the `storage.objects.create` permission.

 For more information, see [Configure External Storage Access for TiDB Serverless](/tidb-cloud/serverless-external-storage.md#configure-gcs-access).

-> **Note:**
->
-> Currently, you can only export to Google Cloud Storage using [TiDB Cloud CLI](/tidb-cloud/cli-reference.md).
-
 ### Azure Blob Storage

 To export data to Azure Blob Storage, you need to provide the following information:

-- URI: `azure://<account-name>.blob.core.windows.net/<container-name>/<file-path>`
+- URI: `azure://<account-name>.blob.core.windows.net/<container-name>/<folder-path>/` or `https://<account-name>.blob.core.windows.net/<container-name>/<folder-path>/`
 - Access credential: a [shared access signature (SAS) token](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview) for your Azure Blob Storage container. Make sure the SAS token has the `Read` and `Write` permissions on the `Container` and `Object` resources.

 For more information, see [Configure External Storage Access for TiDB Serverless](/tidb-cloud/serverless-external-storage.md#configure-azure-blob-storage-access).

-> **Note:**
->
-> Currently, you can only export to Azure Blob Storage using [TiDB Cloud CLI](/tidb-cloud/cli-reference.md).
-
 ## Export options

 ### Data filtering
@@ -92,7 +84,7 @@ You can export data in the following formats:
     - `separator`: specify the character used to separate fields in the exported data. The default separator is `,`.
     - `header`: specify whether to include a header row in the exported data. The default value is `true`.
     - `null-value`: specify the string that represents a NULL value in the exported data. The default value is `\N`.
-- `Parquet`: export data in Parquet format. Currently, it is only supported in TiDB Cloud CLI.
+- `Parquet`: export data in Parquet format.

 The schema and data are exported according to the following naming conventions:

@@ -181,7 +173,7 @@ When exporting data to the Parquet format, the data conversion between TiDB Cloud

     - **Task Name**: enter a name for the export task. The default value is `SNAPSHOT_{snapshot_time}`.
     - **Exported Data**: choose the databases and tables you want to export.
-    - **Data Format**: choose **SQL File** or **CSV**.
+    - **Data Format**: choose **SQL**, **CSV**, or **Parquet**.
     - **Compression**: choose **Gzip**, **Snappy**, **Zstd**, or **None**.

 > **Tip:**
@@ -232,12 +224,12 @@ When exporting data to the Parquet format, the data conversion between TiDB Cloud

     - **Task Name**: enter a name for the export task. The default value is `SNAPSHOT_{snapshot_time}`.
     - **Exported Data**: choose the databases and tables you want to export.
-    - **Data Format**: choose **SQL File** or **CSV**.
+    - **Data Format**: choose **SQL**, **CSV**, or **Parquet**.
     - **Compression**: choose **Gzip**, **Snappy**, **Zstd**, or **None**.
     - **Folder URI**: enter the URI of the Amazon S3 with the `s3://<bucket-name>/<folder-path>/` format.
-    - **Bucket Access**: choose one of the following access credentials and then fill in the credential information. If you do not have such information, see [Configure External Storage Access for TiDB Cloud Serverless](/tidb-cloud/serverless-external-storage.md#configure-amazon-s3-access).
-        - **AWS Role ARN**: enter the role ARN that has the `s3:PutObject` and `s3:ListBucket` permissions to access the bucket.
-        - **AWS Access Key**: enter the access key ID and access key secret that have the `s3:PutObject` and `s3:ListBucket` permissions to access the bucket.
+    - **Bucket Access**: choose one of the following access credentials and then fill in the credential information:
+        - **AWS Role ARN**: enter the role ARN that has the permission to access the bucket. It is recommended to create the role ARN with AWS CloudFormation. For more information, see [Configure External Storage Access for TiDB Cloud Serverless](/tidb-cloud/serverless-external-storage.md#configure-amazon-s3-access).
+        - **AWS Access Key**: enter the access key ID and access key secret that have the permission to access the bucket.

 4. Click **Export**.

@@ -246,38 +238,95 @@ When exporting data to the Parquet format, the data conversion between TiDB Cloud
 <div label="CLI">

 ```shell
-ticloud serverless export create -c <cluster-id> --s3.uri <uri> --s3.access-key-id <access-key-id> --s3.secret-access-key <secret-access-key> --filter "database.table"
+ticloud serverless export create -c <cluster-id> --target-type S3 --s3.uri <uri> --s3.access-key-id <access-key-id> --s3.secret-access-key <secret-access-key> --filter "database.table"
+
+ticloud serverless export create -c <cluster-id> --target-type S3 --s3.uri <uri> --s3.role-arn <role-arn> --filter "database.table"
 ```

-- `s3.uri`: the Amazon S3 URI with the `s3://<bucket-name>/<file-path>` format.
+- `s3.uri`: the Amazon S3 URI with the `s3://<bucket-name>/<folder-path>/` format.
 - `s3.access-key-id`: the access key ID of the user who has the permission to access the bucket.
 - `s3.secret-access-key`: the access key secret of the user who has the permission to access the bucket.
+- `s3.role-arn`: the role ARN that has the permission to access the bucket.

 </div>
 </SimpleTab>
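
For illustration, a filled-in version of the role ARN variant above might look as follows. This is a minimal sketch with placeholder values (the cluster ID, bucket, folder path, role ARN, and table filter are hypothetical) and uses only the flags documented above:

```shell
# Placeholder values for illustration only; replace the cluster ID, bucket,
# folder path, role ARN, and table filter with your own.
ticloud serverless export create \
  -c 1234567890123456789 \
  --target-type S3 \
  --s3.uri s3://my-bucket/exports/ \
  --s3.role-arn arn:aws:iam::123456789012:role/tidb-cloud-export \
  --filter "mydb.orders"
```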

 ### Export data to Google Cloud Storage

-Currently, you can only export data to Google Cloud Storage using [TiDB Cloud CLI](/tidb-cloud/cli-reference.md).
+<SimpleTab>
+<div label="Console">
+
+1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**Clusters**](https://tidbcloud.com/console/clusters) page of your project.
+
+    > **Tip:**
+    >
+    > If you have multiple projects, you can click <MDSvgIcon name="icon-left-projects" /> in the lower-left corner and switch to another project.
+
+2. Click the name of your target cluster to go to its overview page, and then click **Import** in the left navigation pane.
+
+3. On the **Import** page, click **Export Data to** in the upper-right corner, and then choose **Google Cloud Storage** from the drop-down list. Fill in the following parameters:
+
+    - **Task Name**: enter a name for the export task. The default value is `SNAPSHOT_{snapshot_time}`.
+    - **Exported Data**: choose the databases and tables you want to export.
+    - **Data Format**: choose **SQL**, **CSV**, or **Parquet**.
+    - **Compression**: choose **Gzip**, **Snappy**, **Zstd**, or **None**.
+    - **Folder URI**: enter the URI of the Google Cloud Storage with the `gs://<bucket-name>/<folder-path>/` format.
+    - **Bucket Access**: upload the Google Cloud credentials file that has permission to access the bucket.
+
+4. Click **Export**.
+
+</div>
+
+<div label="CLI">

 ```shell
-ticloud serverless export create -c <cluster-id> --gcs.uri <uri> --gcs.service-account-key <service-account-key> --filter "database.table"
+ticloud serverless export create -c <cluster-id> --target-type GCS --gcs.uri <uri> --gcs.service-account-key <service-account-key> --filter "database.table"
 ```

-- `gcs.uri`: the URI of the Google Cloud Storage bucket in the `gs://<bucket-name>/<file-path>` format.
+- `gcs.uri`: the URI of the Google Cloud Storage bucket in the `gs://<bucket-name>/<folder-path>/` format.
 - `gcs.service-account-key`: the base64 encoded service account key.

+</div>
+</SimpleTab>
+
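
The `--gcs.service-account-key` value has to be base64 encoded. As a minimal sketch, assuming the JSON key file downloaded from Google Cloud is saved as `key.json` (the file name, bucket, and table filter below are placeholders), you could encode the key and pass it in one go:

```shell
# Encode the downloaded JSON key file as a single base64 line.
# `base64 -w 0` is the GNU coreutils form; on macOS, use `base64 -i key.json` instead.
KEY=$(base64 -w 0 key.json)

ticloud serverless export create -c <cluster-id> \
  --target-type GCS \
  --gcs.uri gs://my-bucket/exports/ \
  --gcs.service-account-key "$KEY" \
  --filter "mydb.orders"
```
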
 ### Export data to Azure Blob Storage

-Currently, you can only export data to Azure Blob Storage using [TiDB Cloud CLI](/tidb-cloud/cli-reference.md).
+<SimpleTab>
+<div label="Console">
+
+1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**Clusters**](https://tidbcloud.com/console/clusters) page of your project.
+
+    > **Tip:**
+    >
+    > If you have multiple projects, you can click <MDSvgIcon name="icon-left-projects" /> in the lower-left corner and switch to another project.
+
+2. Click the name of your target cluster to go to its overview page, and then click **Import** in the left navigation pane.
+
+3. On the **Import** page, click **Export Data to** in the upper-right corner, and then choose **Azure Blob Storage** from the drop-down list. Fill in the following parameters:
+
+    - **Task Name**: enter a name for the export task. The default value is `SNAPSHOT_{snapshot_time}`.
+    - **Exported Data**: choose the databases and tables you want to export.
+    - **Data Format**: choose **SQL**, **CSV**, or **Parquet**.
+    - **Compression**: choose **Gzip**, **Snappy**, **Zstd**, or **None**.
+    - **Folder URI**: enter the URI of Azure Blob Storage with the `azure://<account-name>.blob.core.windows.net/<container-name>/<folder-path>/` format.
+    - **SAS Token**: enter the SAS token that has the permission to access the container. It is recommended to create a SAS token with the [Azure ARM template](https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/). For more information, see [Configure External Storage Access for TiDB Cloud Serverless](/tidb-cloud/serverless-external-storage.md#configure-azure-blob-storage-access).
+
+4. Click **Export**.
+
+</div>
+
+<div label="CLI">

 ```shell
-ticloud serverless export create -c <cluster-id> --azblob.uri <uri> --azblob.sas-token <sas-token> --filter "database.table"
+ticloud serverless export create -c <cluster-id> --target-type AZURE_BLOB --azblob.uri <uri> --azblob.sas-token <sas-token> --filter "database.table"
 ```

-- `azblob.uri`: the URI of the Azure Blob Storage in the `azure://<account-name>.blob.core.windows.net/<container-name>/<file-path>` format.
+- `azblob.uri`: the URI of the Azure Blob Storage in the `(azure|https)://<account-name>.blob.core.windows.net/<container-name>/<folder-path>/` format.
 - `azblob.sas-token`: the account SAS token of the Azure Blob Storage.

+</div>
+</SimpleTab>
+
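
As with the S3 example above, a hypothetical invocation with placeholder values might look like this (the account, container, folder path, SAS token, and table filter are illustrative only; quoting the SAS token keeps the shell from interpreting the `&` characters it contains):

```shell
# Placeholder values for illustration only.
ticloud serverless export create -c <cluster-id> \
  --target-type AZURE_BLOB \
  --azblob.uri azure://myaccount.blob.core.windows.net/mycontainer/exports/ \
  --azblob.sas-token 'sv=2022-11-02&ss=b&srt=co&sp=rw&se=2025-12-31T00:00:00Z&sig=<signature>' \
  --filter "mydb.orders"
```
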
 ### Cancel an export task

 To cancel an ongoing export task, take the following steps:

tidb-cloud/serverless-external-storage.md

Lines changed: 40 additions & 3 deletions
@@ -199,14 +199,49 @@ Take the following steps to configure a service account key:

     ![service-account-key](/media/tidb-cloud/serverless-external-storage/gcs-service-account-key.png)

-3. Choose the default `JSON` key type, and then click the **CREATE** button to download the service account key.
+3. Choose the default `JSON` key type, and then click **CREATE** to download the Google Cloud credentials file. The file contains the service account key that you need to use when configuring the GCS access for the TiDB Cloud Serverless cluster.

 ## Configure Azure Blob Storage access

-To allow TiDB Serverless to access your Azure Blob container, you need to configure the Azure Blob access for the container. You can use a service SAS token to configure the container access:
+To allow TiDB Serverless to access your Azure Blob container, you need to create a service SAS token for the container.

-1. On the [Azure Storage account](https://portal.azure.com/#browse/Microsoft.Storage%2FStorageAccounts) page, click your storage account to which the container belongs.
+You can create a SAS token either using an [Azure ARM template](https://learn.microsoft.com/en-us/azure/azure-resource-manager/templates/overview) (recommended) or manual configuration.
+
+To create a SAS token using an Azure ARM template, take the following steps:
+
+1. Open the **Import** page for your target cluster.

+    1. Log in to the [TiDB Cloud console](https://tidbcloud.com/) and navigate to the [**Clusters**](https://tidbcloud.com/console/clusters) page of your project.
+
+    2. Click the name of your target cluster to go to its overview page, and then click **Import** in the left navigation pane.
+
+2. Open the **Generate New SAS Token via ARM Template Deployment** dialog.
+
+    1. Click **Export data to...** > **Azure Blob Storage**. If your cluster has neither imported nor exported any data before, click **Click here to export data to...** > **Azure Blob Storage** at the bottom of the page.
+
+    2. Scroll down to the **Azure Blob Storage Settings** area, and then click **Click here to create a new one with Azure ARM template** under the SAS Token field.
+
+3. Create a SAS token with the Azure ARM template.
+
+    1. In the **Generate New SAS Token via ARM Template Deployment** dialog, click **Click to open the Azure Portal with the pre-configured ARM template**.
+
+    2. After logging in to Azure, you will be redirected to the Azure **Custom deployment** page.
+
+    3. Fill in the **Resource group** and **Storage Account Name** in the **Custom deployment** page. You can get all the information from the storage account overview page where the container is located.
+
+        ![azure-storage-account-overview](/media/tidb-cloud/serverless-external-storage/azure-storage-account-overview.png)
+
+    4. Click **Review + create** or **Next** to review the deployment. Click **Create** to start the deployment.
+
+    5. After it completes, you will be redirected to the deployment overview page. Navigate to the **Outputs** section to get the SAS token.
+
+If you have any trouble creating a SAS token with the Azure ARM template, take the following steps to create one manually:
+
+<details>
+<summary>Click here to see details</summary>
+
+1. On the [Azure Storage account](https://portal.azure.com/#browse/Microsoft.Storage%2FStorageAccounts) page, click your storage account to which the container belongs.
+
 2. On your **Storage account** page, click the **Security+network**, and then click **Shared access signature**.

     ![sas-position](/media/tidb-cloud/serverless-external-storage/azure-sas-position.png)
@@ -222,3 +257,5 @@ To allow TiDB Serverless to access your Azure Blob container, you need to configure
     ![sas-create](/media/tidb-cloud/serverless-external-storage/azure-sas-create.png)

 4. Click **Generate SAS and connection string** to generate the SAS token.
+
+</details>
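
For those who prefer a terminal over the portal, an account SAS with the `Read` and `Write` permissions described above can also be generated with the Azure CLI. This is only a sketch outside the documented steps: the storage account name and expiry are placeholders, and the exact flags should be verified against `az storage account generate-sas --help`.

```shell
# Generate an account SAS scoped to the Blob service (--services b) with
# Container and Object resource types (--resource-types co) and read/write
# permissions. Placeholder account name and expiry; requires access to the
# storage account key.
az storage account generate-sas \
  --account-name mystorageaccount \
  --services b \
  --resource-types co \
  --permissions rw \
  --expiry 2025-12-31T00:00Z \
  --output tsv
```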
