Commit c3d89a5

updated SQL for DATA_LENGTH (#15432) (#15434)
1 parent aad5952 commit c3d89a5

3 files changed: +79 −32 lines changed

dm/dm-hardware-and-software-requirements.md

Lines changed: 28 additions & 15 deletions
@@ -56,18 +56,31 @@ The target TiKV cluster must have enough disk space to store the imported data.
 
 You can estimate the data volume by using the following SQL statements to summarize the `DATA_LENGTH` field:
 
-- Calculate the size of all schemas, in MiB. Replace `${schema_name}` with your schema name.
-
-    {{< copyable "sql" >}}
-
-    ```sql
-    select table_schema,sum(data_length)/1024/1024 as data_length,sum(index_length)/1024/1024 as index_length,sum(data_length+index_length)/1024/1024 as sum from information_schema.tables where table_schema = "${schema_name}" group by table_schema;
-    ```
-
-- Calculate the size of the largest table, in MiB. Replace ${schema_name} with your schema name.
-
-    {{< copyable "sql" >}}
-
-    ```sql
-    select table_name,table_schema,sum(data_length)/1024/1024 as data_length,sum(index_length)/1024/1024 as index_length,sum(data_length+index_length)/1024/1024 as sum from information_schema.tables where table_schema = "${schema_name}" group by table_name,table_schema order by sum desc limit 5;
-    ```
+```sql
+-- Calculate the size of all schemas
+SELECT
+    TABLE_SCHEMA,
+    FORMAT_BYTES(SUM(DATA_LENGTH)) AS 'Data Size',
+    FORMAT_BYTES(SUM(INDEX_LENGTH)) 'Index Size'
+FROM
+    information_schema.tables
+GROUP BY
+    TABLE_SCHEMA;
+
+-- Calculate the 5 largest tables
+SELECT
+    TABLE_NAME,
+    TABLE_SCHEMA,
+    FORMAT_BYTES(SUM(data_length)) AS 'Data Size',
+    FORMAT_BYTES(SUM(index_length)) AS 'Index Size',
+    FORMAT_BYTES(SUM(data_length+index_length)) AS 'Total Size'
+FROM
+    information_schema.tables
+GROUP BY
+    TABLE_NAME,
+    TABLE_SCHEMA
+ORDER BY
+    SUM(DATA_LENGTH+INDEX_LENGTH) DESC
+LIMIT
+    5;
+```
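
The rewritten queries above aggregate every schema on the instance. If you only need the schemas that will actually be migrated, a `WHERE` filter can be added in the same style as the replaced queries; a minimal sketch (with `${schema_name}` as a placeholder for your schema name, as in the old text):

```sql
-- Sketch: limit the per-schema estimate to one schema.
-- Replace ${schema_name} with the schema you plan to migrate.
SELECT
    TABLE_SCHEMA,
    FORMAT_BYTES(SUM(DATA_LENGTH)) AS 'Data Size',
    FORMAT_BYTES(SUM(INDEX_LENGTH)) AS 'Index Size'
FROM
    information_schema.tables
WHERE
    TABLE_SCHEMA = '${schema_name}'
GROUP BY
    TABLE_SCHEMA;
```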

migrate-large-mysql-to-tidb.md

Lines changed: 26 additions & 7 deletions
@@ -31,14 +31,33 @@ This document describes how to perform the full migration using Dumpling and TiDB Lightning.
 
 **Note**: It is difficult to calculate the exact data volume exported by Dumpling from MySQL, but you can estimate the data volume by using the following SQL statement to summarize the `DATA_LENGTH` field in the `information_schema.tables` table:
 
-{{< copyable "" >}}
-
 ```sql
-/* Calculate the size of all schemas, in MiB. Replace ${schema_name} with your schema name. */
-SELECT table_schema,SUM(data_length)/1024/1024 AS data_length,SUM(index_length)/1024/1024 AS index_length,SUM(data_length+index_length)/1024/1024 AS SUM FROM information_schema.tables WHERE table_schema = "${schema_name}" GROUP BY table_schema;
-
-/* Calculate the size of the largest table, in MiB. Replace ${schema_name} with your schema name. */
-SELECT table_name,table_schema,SUM(data_length)/1024/1024 AS data_length,SUM(index_length)/1024/1024 AS index_length,SUM(data_length+index_length)/1024/1024 AS SUM from information_schema.tables WHERE table_schema = "${schema_name}" GROUP BY table_name,table_schema ORDER BY SUM DESC LIMIT 5;
+-- Calculate the size of all schemas
+SELECT
+    TABLE_SCHEMA,
+    FORMAT_BYTES(SUM(DATA_LENGTH)) AS 'Data Size',
+    FORMAT_BYTES(SUM(INDEX_LENGTH)) 'Index Size'
+FROM
+    information_schema.tables
+GROUP BY
+    TABLE_SCHEMA;
+
+-- Calculate the 5 largest tables
+SELECT
+    TABLE_NAME,
+    TABLE_SCHEMA,
+    FORMAT_BYTES(SUM(data_length)) AS 'Data Size',
+    FORMAT_BYTES(SUM(index_length)) AS 'Index Size',
+    FORMAT_BYTES(SUM(data_length+index_length)) AS 'Total Size'
+FROM
+    information_schema.tables
+GROUP BY
+    TABLE_NAME,
+    TABLE_SCHEMA
+ORDER BY
+    SUM(DATA_LENGTH+INDEX_LENGTH) DESC
+LIMIT
+    5;
 ```
 
 ### Disk space for the target TiKV cluster
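
Note that `FORMAT_BYTES()` is only available as a native function on relatively recent MySQL versions (it was added in MySQL 8.0.16). On older source instances, a fallback is to report the sizes in MiB with plain arithmetic, as the replaced queries did; a minimal sketch:

```sql
-- Fallback sketch for servers without FORMAT_BYTES(): sizes in MiB,
-- using the same arithmetic as the queries this commit removes.
SELECT
    TABLE_SCHEMA,
    SUM(DATA_LENGTH) / 1024 / 1024 AS data_mib,
    SUM(INDEX_LENGTH) / 1024 / 1024 AS index_mib,
    SUM(DATA_LENGTH + INDEX_LENGTH) / 1024 / 1024 AS total_mib
FROM
    information_schema.tables
GROUP BY
    TABLE_SCHEMA;
```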

tidb-lightning/tidb-lightning-requirements.md

Lines changed: 25 additions & 10 deletions
@@ -86,16 +86,31 @@ The target TiKV cluster must have enough disk space to store the imported data.
 
 It is difficult to calculate the exact data volume exported by Dumpling from MySQL. However, you can estimate the data volume by using the following SQL statement to summarize the `DATA_LENGTH` field in the information_schema.tables table:
 
-Calculate the size of all schemas, in MiB. Replace ${schema_name} with your schema name.
-
 ```sql
-SELECT table_schema, SUM(data_length)/1024/1024 AS data_length, SUM(index_length)/1024/1024 AS index_length, SUM(data_length+index_length)/1024/1024 AS sum FROM information_schema.tables WHERE table_schema = "${schema_name}" GROUP BY table_schema;
-```
-
-Calculate the size of the largest table, in MiB. Replace ${schema_name} with your schema name.
+-- Calculate the size of all schemas
+SELECT
+    TABLE_SCHEMA,
+    FORMAT_BYTES(SUM(DATA_LENGTH)) AS 'Data Size',
+    FORMAT_BYTES(SUM(INDEX_LENGTH)) 'Index Size'
+FROM
+    information_schema.tables
+GROUP BY
+    TABLE_SCHEMA;
 
-{{< copyable "sql" >}}
-
-```sql
-SELECT table_name, table_schema, SUM(data_length)/1024/1024 AS data_length, SUM(index_length)/1024/1024 AS index_length,sum(data_length+index_length)/1024/1024 AS sum FROM information_schema.tables WHERE table_schema = "${schema_name}" GROUP BY table_name,table_schema ORDER BY sum DESC LIMIT 5;
+-- Calculate the 5 largest tables
+SELECT
+    TABLE_NAME,
+    TABLE_SCHEMA,
+    FORMAT_BYTES(SUM(data_length)) AS 'Data Size',
+    FORMAT_BYTES(SUM(index_length)) AS 'Index Size',
+    FORMAT_BYTES(SUM(data_length+index_length)) AS 'Total Size'
+FROM
+    information_schema.tables
+GROUP BY
+    TABLE_NAME,
+    TABLE_SCHEMA
+ORDER BY
+    SUM(DATA_LENGTH+INDEX_LENGTH) DESC
+LIMIT
+    5;
 ```
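
Since this section is about disk space on the target TiKV cluster, it can also help to collapse the estimate into a single instance-wide total; a minimal sketch in the same style, under the assumption that the raw `DATA_LENGTH + INDEX_LENGTH` sum is an acceptable rough lower bound for the import size:

```sql
-- Sketch: one overall total across all schemas, as a rough lower bound
-- for the disk space the imported data will occupy.
SELECT
    FORMAT_BYTES(SUM(DATA_LENGTH + INDEX_LENGTH)) AS 'Total Size'
FROM
    information_schema.tables;
```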
