tidb-cloud/serverless-export.md (18 additions & 10 deletions)
@@ -36,22 +36,30 @@ Exporting data to local file has the following limitations:
### Amazon S3

-To export data to Amazon S3, you need to provide one of the following access methods for your Amazon S3 bucket:
+To export data to Amazon S3, you need to provide the following information:

-- [Access key](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html): make sure the access key has these permissions: `s3:PutObject` and `s3:ListBucket`.
-- [Role arn](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference-arns.html): make sure the role arn has these permissions: `s3:PutObject` and `s3:ListBucket`.
+- uri: `s3://<bucket-name>/<file-path>`
+- one of the following access methods:
+    - [access key](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html): make sure the access key has the `s3:PutObject` and `s3:ListBucket` permissions.
+    - [role arn](https://docs.aws.amazon.com/IAM/latest/UserGuide/reference-arns.html): make sure the role arn has the `s3:PutObject` and `s3:ListBucket` permissions.
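For reference, a minimal sketch of an IAM policy that grants the two permissions above on the target bucket, attached to the role whose ARN (or the user whose access key) you hand to the export. The bucket, role, and policy names are placeholders, not values from this page:

```shell
# Minimal policy sketch: s3:ListBucket on the bucket, s3:PutObject on its objects.
# <your-bucket>, tidb-cloud-export-role, and tidb-cloud-export are placeholders.
cat > tidb-export-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": "s3:ListBucket", "Resource": "arn:aws:s3:::<your-bucket>" },
    { "Effect": "Allow", "Action": "s3:PutObject",  "Resource": "arn:aws:s3:::<your-bucket>/*" }
  ]
}
EOF

# Attach the policy to the IAM role used for role arn access
# (or use `aws iam put-user-policy` for the user that owns the access key).
aws iam put-role-policy \
  --role-name tidb-cloud-export-role \
  --policy-name tidb-cloud-export \
  --policy-document file://tidb-export-policy.json
```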
### Google Cloud Storage
-To export data to Google Cloud Storage, you need to provide a **base64 encoded** [service account key](https://cloud.google.com/iam/docs/creating-managing-service-account-keys) for your Google Cloud Storage bucket. Make sure the service account key has these permissions: `storage.objects.create`. You may also need `storage.objects.delete` permission when you export to a non-empty folder.
+To export data to Google Cloud Storage, you need to provide the following information:
+
+- uri: `gs://<bucket-name>/<file-path>`
+- access method: a **base64 encoded** [service account key](https://cloud.google.com/iam/docs/creating-managing-service-account-keys) for your bucket. Make sure the service account key has the `storage.objects.create` permission.

> **Note:**
>
> Only supported in TiDB Cloud CLI now.
### Azure Blob Storage
-To export data to Azure Blob Storage, you need to provide a [shared access signature (SAS) token](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview) for your Azure Blob Storage container. Make sure the SAS token has the `Read` and `Write` permissions on the `Container` and `Object` resources.
+To export data to Azure Blob Storage, you need to provide the following information:
+
+- access method: a [shared access signature (SAS) token](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview) for your Azure Blob Storage container. Make sure the SAS token has the `Read` and `Write` permissions on the `Container` and `Object` resources.

> **Note:**
>
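As a sketch, one way to mint a SAS token with the `Read` and `Write` permissions described above is the Azure CLI; the storage account name, container name, and expiry are placeholders, and the account key is read from `--account-key` or the `AZURE_STORAGE_KEY` environment variable:

```shell
# Container-scoped SAS with Read and Write permissions; values are placeholders.
az storage container generate-sas \
  --account-name <your-storage-account> \
  --name <your-container> \
  --permissions rw \
  --expiry 2025-12-31T00:00Z \
  --output tsv
```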
@@ -177,7 +185,7 @@ You can compress the exported parquet data using the following algorithms:
- **Exported data**: choose the databases and tables you want to export.
- **Data format**: choose one of the **SQL File** and **CSV**.
- **Compression**: choose one of the **Gzip**, **Snappy**, **Zstd**, and **None**.
-- **File URI**: enter the URI of the Amazon S3.
+- **File URI**: enter the URI of the Amazon S3 with the `s3://<bucket-name>/<file-path>` format.
- **Bucket Access**
- **AWS Role Arn**: enter the ARN of the role that has the permission to access the bucket.
- **AWS Access Key ID**: enter the access key ID and access key secret that has the permission to access the bucket.
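The same settings map onto the TiDB Cloud CLI export command noted above. A rough sketch, assuming the `ticloud serverless export create` flag names from the CLI reference (verify against `ticloud serverless export create --help`), with the cluster ID, bucket, and credentials as placeholders:

```shell
# Rough sketch only; flag names are taken from the TiDB Cloud CLI reference
# and may differ in your CLI version. Check `ticloud serverless export create --help`.
ticloud serverless export create \
  --cluster-id <your-cluster-id> \
  --s3.uri "s3://<bucket-name>/<file-path>" \
  --s3.access-key-id <your-access-key-id> \
  --s3.secret-access-key <your-secret-access-key>
```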
@@ -194,15 +202,15 @@ You can compress the exported parquet data using the following algorithms: