From 5ce175cc1ce4543c1b95c1583758fb369225944b Mon Sep 17 00:00:00 2001
From: Grace Cai
Date: Sat, 12 Oct 2024 10:21:32 +0800
Subject: [PATCH] rename config-s3-and-gcs-access.md to dedicated-external-storage.md

---
 TOC-tidb-cloud.md                                         | 2 +-
 ...s3-and-gcs-access.md => dedicated-external-storage.md} | 1 +
 tidb-cloud/import-csv-files.md                            | 8 ++++----
 tidb-cloud/import-parquet-files.md                        | 8 ++++----
 tidb-cloud/import-sample-data.md                          | 2 +-
 tidb-cloud/migrate-sql-shards.md                          | 2 +-
 tidb-cloud/release-notes-2023.md                          | 2 +-
 tidb-cloud/serverless-external-storage.md                 | 2 +-
 tidb-cloud/terraform-use-import-resource.md               | 2 +-
 tidb-cloud/tidb-cloud-migration-overview.md               | 2 +-
 tidb-cloud/troubleshoot-import-access-denied-error.md     | 2 +-
 11 files changed, 17 insertions(+), 16 deletions(-)
 rename tidb-cloud/{config-s3-and-gcs-access.md => dedicated-external-storage.md} (99%)

diff --git a/TOC-tidb-cloud.md b/TOC-tidb-cloud.md
index 7411570da4499..986a125213738 100644
--- a/TOC-tidb-cloud.md
+++ b/TOC-tidb-cloud.md
@@ -232,7 +232,7 @@
         - [Import Apache Parquet Files from Amazon S3 or GCS](/tidb-cloud/import-parquet-files.md)
         - [Import with MySQL CLI](/tidb-cloud/import-with-mysql-cli.md)
     - Reference
-        - [Configure Amazon S3 Access and GCS Access](/tidb-cloud/config-s3-and-gcs-access.md)
+        - [Configure Amazon S3 Access and GCS Access](/tidb-cloud/dedicated-external-storage.md)
         - [Naming Conventions for Data Import](/tidb-cloud/naming-conventions-for-data-import.md)
         - [CSV Configurations for Importing Data](/tidb-cloud/csv-config-for-import-data.md)
         - [Troubleshoot Access Denied Errors during Data Import from Amazon S3](/tidb-cloud/troubleshoot-import-access-denied-error.md)
diff --git a/tidb-cloud/config-s3-and-gcs-access.md b/tidb-cloud/dedicated-external-storage.md
similarity index 99%
rename from tidb-cloud/config-s3-and-gcs-access.md
rename to tidb-cloud/dedicated-external-storage.md
index fa19d65fd3943..749d2c9b890ae 100644
--- a/tidb-cloud/config-s3-and-gcs-access.md
+++ b/tidb-cloud/dedicated-external-storage.md
@@ -1,6 +1,7 @@
 ---
 title: Configure External Storage Access for TiDB Cloud Dedicated
 summary: Learn how to configure Amazon Simple Storage Service (Amazon S3) access and Google Cloud Storage (GCS) access.
+aliases: ['/tidb-cloud/config-s3-and-gcs-access']
 ---
 
 # Configure External Storage Access for TiDB Cloud Dedicated
diff --git a/tidb-cloud/import-csv-files.md b/tidb-cloud/import-csv-files.md
index 04ee55f9b012f..45281e4b2fe62 100644
--- a/tidb-cloud/import-csv-files.md
+++ b/tidb-cloud/import-csv-files.md
@@ -80,11 +80,11 @@ Because CSV files do not contain schema information, before importing data from
 
 To allow TiDB Cloud to access the CSV files in the Amazon S3 or GCS bucket, do one of the following:
 
-- If your CSV files are located in Amazon S3, [configure Amazon S3 access](/tidb-cloud/config-s3-and-gcs-access.md#configure-amazon-s3-access).
+- If your CSV files are located in Amazon S3, [configure Amazon S3 access](/tidb-cloud/dedicated-external-storage.md#configure-amazon-s3-access).
 
   You can use either an AWS access key or a Role ARN to access your bucket. Once finished, make a note of the access key (including the access key ID and secret access key) or the Role ARN value as you will need it in [Step 4](#step-4-import-csv-files-to-tidb-cloud).
 
-- If your CSV files are located in GCS, [configure GCS access](/tidb-cloud/config-s3-and-gcs-access.md#configure-gcs-access).
+- If your CSV files are located in GCS, [configure GCS access](/tidb-cloud/dedicated-external-storage.md#configure-gcs-access).
 
 ## Step 4. Import CSV files to TiDB Cloud
 
@@ -115,7 +115,7 @@ To import the CSV files to TiDB Cloud, take the following steps:
     - **File URI** or **Folder URI**:
         - When importing one file, enter the source file URI and name in the following format `s3://[bucket_name]/[data_source_folder]/[file_name].csv`. For example, `s3://sampledata/ingest/TableName.01.csv`.
         - When importing multiple files, enter the source file URI and name in the following format `s3://[bucket_name]/[data_source_folder]/`. For example, `s3://sampledata/ingest/`.
-    - **Bucket Access**: you can use either an AWS Role ARN or an AWS access key to access your bucket. For more information, see [Configure Amazon S3 access](/tidb-cloud/config-s3-and-gcs-access.md#configure-amazon-s3-access).
+    - **Bucket Access**: you can use either an AWS Role ARN or an AWS access key to access your bucket. For more information, see [Configure Amazon S3 access](/tidb-cloud/dedicated-external-storage.md#configure-amazon-s3-access).
         - **AWS Role ARN**: enter the AWS Role ARN value.
         - **AWS Access Key**: enter the AWS access key ID and AWS secret access key.
 
@@ -169,7 +169,7 @@ To import the CSV files to TiDB Cloud, take the following steps:
     - **File URI** or **Folder URI**:
         - When importing one file, enter the source file URI and name in the following format `gs://[bucket_name]/[data_source_folder]/[file_name].csv`. For example, `gs://sampledata/ingest/TableName.01.csv`.
         - When importing multiple files, enter the source file URI and name in the following format `gs://[bucket_name]/[data_source_folder]/`. For example, `gs://sampledata/ingest/`.
-    - **Bucket Access**: you can use a GCS IAM Role to access your bucket. For more information, see [Configure GCS access](/tidb-cloud/config-s3-and-gcs-access.md#configure-gcs-access).
+    - **Bucket Access**: you can use a GCS IAM Role to access your bucket. For more information, see [Configure GCS access](/tidb-cloud/dedicated-external-storage.md#configure-gcs-access).
 
 4. Click **Connect**.
 
diff --git a/tidb-cloud/import-parquet-files.md b/tidb-cloud/import-parquet-files.md
index db9bd94a49eb4..23ba4d280e62c 100644
--- a/tidb-cloud/import-parquet-files.md
+++ b/tidb-cloud/import-parquet-files.md
@@ -86,11 +86,11 @@ Because Parquet files do not contain schema information, before importing data f
 
 To allow TiDB Cloud to access the Parquet files in the Amazon S3 or GCS bucket, do one of the following:
 
-- If your Parquet files are located in Amazon S3, [configure Amazon S3 access](/tidb-cloud/config-s3-and-gcs-access.md#configure-amazon-s3-access).
+- If your Parquet files are located in Amazon S3, [configure Amazon S3 access](/tidb-cloud/dedicated-external-storage.md#configure-amazon-s3-access).
 
   You can use either an AWS access key or a Role ARN to access your bucket. Once finished, make a note of the access key (including the access key ID and secret access key) or the Role ARN value as you will need it in [Step 4](#step-4-import-parquet-files-to-tidb-cloud).
 
-- If your Parquet files are located in GCS, [configure GCS access](/tidb-cloud/config-s3-and-gcs-access.md#configure-gcs-access).
+- If your Parquet files are located in GCS, [configure GCS access](/tidb-cloud/dedicated-external-storage.md#configure-gcs-access).
 
 ## Step 4. Import Parquet files to TiDB Cloud
 
@@ -121,7 +121,7 @@ To import the Parquet files to TiDB Cloud, take the following steps:
     - **File URI** or **Folder URI**:
        - When importing one file, enter the source file URI and name in the following format `s3://[bucket_name]/[data_source_folder]/[file_name].parquet`. For example, `s3://sampledata/ingest/TableName.01.parquet`.
         - When importing multiple files, enter the source file URI and name in the following format `s3://[bucket_name]/[data_source_folder]/`. For example, `s3://sampledata/ingest/`.
-    - **Bucket Access**: you can use either an AWS Role ARN or an AWS access key to access your bucket. For more information, see [Configure Amazon S3 access](/tidb-cloud/config-s3-and-gcs-access.md#configure-amazon-s3-access).
+    - **Bucket Access**: you can use either an AWS Role ARN or an AWS access key to access your bucket. For more information, see [Configure Amazon S3 access](/tidb-cloud/dedicated-external-storage.md#configure-amazon-s3-access).
         - **AWS Role ARN**: enter the AWS Role ARN value.
         - **AWS Access Key**: enter the AWS access key ID and AWS secret access key.
 
@@ -175,7 +175,7 @@ To import the Parquet files to TiDB Cloud, take the following steps:
     - **File URI** or **Folder URI**:
         - When importing one file, enter the source file URI and name in the following format `gs://[bucket_name]/[data_source_folder]/[file_name].parquet`. For example, `gs://sampledata/ingest/TableName.01.parquet`.
         - When importing multiple files, enter the source file URI and name in the following format `gs://[bucket_name]/[data_source_folder]/`. For example, `gs://sampledata/ingest/`.
-    - **Bucket Access**: you can use a GCS IAM Role to access your bucket. For more information, see [Configure GCS access](/tidb-cloud/config-s3-and-gcs-access.md#configure-gcs-access).
+    - **Bucket Access**: you can use a GCS IAM Role to access your bucket. For more information, see [Configure GCS access](/tidb-cloud/dedicated-external-storage.md#configure-gcs-access).
 
 4. Click **Connect**.
 
diff --git a/tidb-cloud/import-sample-data.md b/tidb-cloud/import-sample-data.md
index 62d6d8ddd2732..7f1556bd443d6 100644
--- a/tidb-cloud/import-sample-data.md
+++ b/tidb-cloud/import-sample-data.md
@@ -65,7 +65,7 @@ This document describes how to import the sample data into TiDB Cloud via the UI
     - To import into pre-created tables, select **No**. This enables you to create tables in TiDB in advance and select the tables that you want to import data into. In this case, you can choose up to 1000 tables to import. You can click **SQL Editor** in the left navigation pane to create tables. For more information about how to use SQL Editor, see [Explore your data with AI-assisted SQL Editor](/tidb-cloud/explore-data-with-chat2query.md).
     - **Data Format**: select **SQL**. TiDB Cloud supports importing compressed files in the following formats: `.gzip`, `.gz`, `.zstd`, `.zst` and `.snappy`. If you want to import compressed SQL files, name the files in the `${db_name}.${table_name}.${suffix}.sql.${compress}` format, in which `${suffix}` is optional and can be any integer such as '000001'. For example, if you want to import the `trips.000001.sql.gz` file to the `bikeshare.trips` table, you can rename the file as `bikeshare.trips.000001.sql.gz`. Note that you only need to compress the data files, not the database or table schema files. The Snappy compressed file must be in the [official Snappy format](https://github.com/google/snappy). Other variants of Snappy compression are not supported.
     - **Folder URI** or **File URI**: enter the sample data URI `gs://tidbcloud-samples-us-west1/`.
-    - **Bucket Access**: you can use a GCS IAM Role to access your bucket. For more information, see [Configure GCS access](/tidb-cloud/config-s3-and-gcs-access.md#configure-gcs-access).
+    - **Bucket Access**: you can use a GCS IAM Role to access your bucket. For more information, see [Configure GCS access](/tidb-cloud/dedicated-external-storage.md#configure-gcs-access).
 
     If the region of the bucket is different from your cluster, confirm the compliance of cross region.
 
diff --git a/tidb-cloud/migrate-sql-shards.md b/tidb-cloud/migrate-sql-shards.md
index c424d6c9a8af7..a3cb167fd08d4 100644
--- a/tidb-cloud/migrate-sql-shards.md
+++ b/tidb-cloud/migrate-sql-shards.md
@@ -138,7 +138,7 @@ For more information about the solutions to solve such conflicts, see [Remove th
 
 ### Step 4. Configure Amazon S3 access
 
-Follow the instructions in [Configure Amazon S3 access](/tidb-cloud/config-s3-and-gcs-access.md#configure-amazon-s3-access) to get the role ARN to access the source data.
+Follow the instructions in [Configure Amazon S3 access](/tidb-cloud/dedicated-external-storage.md#configure-amazon-s3-access) to get the role ARN to access the source data.
 
 The following example only lists key policy configurations. Replace the Amazon S3 path with your own values.
 
diff --git a/tidb-cloud/release-notes-2023.md b/tidb-cloud/release-notes-2023.md
index f22f8821c3a15..aca8fb88f4c3c 100644
--- a/tidb-cloud/release-notes-2023.md
+++ b/tidb-cloud/release-notes-2023.md
@@ -871,7 +871,7 @@ This page lists the release notes of [TiDB Cloud](https://www.pingcap.com/tidb-c
 
 - Support using the AWS access keys of an IAM user to access your Amazon S3 bucket when importing data to TiDB Cloud.
 
-    This method is simpler than using Role ARN. For more information, refer to [Configure Amazon S3 access](/tidb-cloud/config-s3-and-gcs-access.md#configure-amazon-s3-access).
+    This method is simpler than using Role ARN. For more information, refer to [Configure Amazon S3 access](/tidb-cloud/dedicated-external-storage.md#configure-amazon-s3-access).
 
 - Extend the [monitoring metrics retention period](/tidb-cloud/built-in-monitoring.md#metrics-retention-policy) from 2 days to a longer period:
 
diff --git a/tidb-cloud/serverless-external-storage.md b/tidb-cloud/serverless-external-storage.md
index a17a130528016..841c7543b80ab 100644
--- a/tidb-cloud/serverless-external-storage.md
+++ b/tidb-cloud/serverless-external-storage.md
@@ -7,7 +7,7 @@ summary: Learn how to configure Amazon Simple Storage Service (Amazon S3) access
 
 If you want to import data from or export data to an external storage in a TiDB Cloud Serverless cluster, you need to configure cross-account access. This document describes how to configure access to an external storage for TiDB Cloud Serverless clusters.
 
-If you need to configure these external storages for a TiDB Cloud Dedicated cluster, see [Configure External Storage Access for TiDB Cloud Dedicated](/tidb-cloud/config-s3-and-gcs-access.md).
+If you need to configure these external storages for a TiDB Cloud Dedicated cluster, see [Configure External Storage Access for TiDB Cloud Dedicated](/tidb-cloud/dedicated-external-storage.md).
 
 ## Configure Amazon S3 access
 
diff --git a/tidb-cloud/terraform-use-import-resource.md b/tidb-cloud/terraform-use-import-resource.md
index 3b75f6d3d92d7..416ff55f5dda9 100644
--- a/tidb-cloud/terraform-use-import-resource.md
+++ b/tidb-cloud/terraform-use-import-resource.md
@@ -189,7 +189,7 @@ You can manage either a local import task or an Amazon S3 import task using the
 
 > **Note:**
 >
-> To allow TiDB Cloud to access your files in the Amazon S3 bucket, you need to [configure Amazon S3 access](/tidb-cloud/config-s3-and-gcs-access.md#configure-amazon-s3-access) first.
+> To allow TiDB Cloud to access your files in the Amazon S3 bucket, you need to [configure Amazon S3 access](/tidb-cloud/dedicated-external-storage.md#configure-amazon-s3-access) first.
 
 1. Create an `import` directory, and then create a `main.tf` inside it. For example:
 
diff --git a/tidb-cloud/tidb-cloud-migration-overview.md b/tidb-cloud/tidb-cloud-migration-overview.md
index e6dfcd50e4bb3..e91da03d21ad7 100644
--- a/tidb-cloud/tidb-cloud-migration-overview.md
+++ b/tidb-cloud/tidb-cloud-migration-overview.md
@@ -55,7 +55,7 @@ If you have data files in SQL, CSV, Parquet, or Aurora Snapshot formats, you can
 
 ### Configure Amazon S3 access and GCS access
 
-If your source data is stored in Amazon S3 or Google Cloud Storage (GCS) buckets, before importing or migrating the data to TiDB Cloud, you need to configure access to the buckets. For more information, see [Configure Amazon S3 access and GCS access](/tidb-cloud/config-s3-and-gcs-access.md).
+If your source data is stored in Amazon S3 or Google Cloud Storage (GCS) buckets, before importing or migrating the data to TiDB Cloud, you need to configure access to the buckets. For more information, see [Configure Amazon S3 access and GCS access](/tidb-cloud/dedicated-external-storage.md).
 
 ### Naming conventions for data import
 
diff --git a/tidb-cloud/troubleshoot-import-access-denied-error.md b/tidb-cloud/troubleshoot-import-access-denied-error.md
index d418157f2f5cd..dff1a27b43a28 100644
--- a/tidb-cloud/troubleshoot-import-access-denied-error.md
+++ b/tidb-cloud/troubleshoot-import-access-denied-error.md
@@ -50,7 +50,7 @@ In the sample trust entity:
 
 ### Check whether the IAM role exists
 
-If the IAM role does not exist, create a role following instructions in [Configure Amazon S3 access](/tidb-cloud/config-s3-and-gcs-access.md#configure-amazon-s3-access).
+If the IAM role does not exist, create a role following instructions in [Configure Amazon S3 access](/tidb-cloud/dedicated-external-storage.md#configure-amazon-s3-access).
 
 ### Check whether the external ID is set correctly
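Apart from the `aliases` front-matter entry, every hunk in this patch is the same mechanical substitution: each link to the old file path is pointed at the renamed file, and anchor fragments such as `#configure-amazon-s3-access` carry over unchanged because only the path portion differs. A minimal sketch of that rewrite (the helper name and sample string are illustrative, not part of the patch):

```python
# Hypothetical sketch of the link rewrite this patch performs across the docs.
OLD_PATH = "/tidb-cloud/config-s3-and-gcs-access.md"
NEW_PATH = "/tidb-cloud/dedicated-external-storage.md"

def rewrite_links(markdown: str) -> str:
    """Point links at the renamed file; #anchors survive because only
    the path prefix changes."""
    return markdown.replace(OLD_PATH, NEW_PATH)

sample = ("[Configure Amazon S3 access]"
          "(/tidb-cloud/config-s3-and-gcs-access.md#configure-amazon-s3-access)")
print(rewrite_links(sample))
# → [Configure Amazon S3 access](/tidb-cloud/dedicated-external-storage.md#configure-amazon-s3-access)
```

The added `aliases: ['/tidb-cloud/config-s3-and-gcs-access']` entry complements this: it redirects the old URL to the renamed page, so any external links that are not rewritten by the patch keep working.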