From 5f9fa9087d5f8ff15fae2699a518e22d6e095e29 Mon Sep 17 00:00:00 2001
From: Victor Engmark
Date: Fri, 4 Oct 2024 10:49:49 +1300
Subject: [PATCH] docs: Show the max savings of compressing JSON files

---
 docs/GeoJSON-compression.md | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)

diff --git a/docs/GeoJSON-compression.md b/docs/GeoJSON-compression.md
index b3ef5e8f..d1f085e8 100644
--- a/docs/GeoJSON-compression.md
+++ b/docs/GeoJSON-compression.md
@@ -18,4 +18,6 @@ Contra compression:
 - [AWS CLI issue](https://github.com/aws/aws-cli/issues/6765)
 - [boto3 issue](https://github.com/boto/botocore/issues/1255)
 - Any files on S3 "[smaller than 128 KB](https://aws.amazon.com/s3/pricing/)" (presumably actually 128 KiB) are treated as being 128 KB for pricing purposes, so there would be no price gain from compressing any files which are smaller than this
-- The extra development time to deal with compressing and decompressing would probably not offset the savings
+- The extra development time to deal with compressing and decompressing JSON files larger than 128 KB would not offset the savings:
+  - We can get the sizes of JSON files by running `aws s3api list-objects-v2 --bucket=nz-elevation --no-sign-request --query="Contents[?ends_with(Key, 'json')].Size"` and `aws s3api list-objects-v2 --bucket=nz-imagery --no-sign-request --query="Contents[?ends_with(Key, 'json')].Size"`
+  - Summing up the sizes of files larger than 128 KB, we get a total of only _33 MB_ at time of writing
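
For reference, a minimal sketch of how the _33 MB_ total could be reproduced from the two listing commands quoted in the patch. It loops over the two buckets named above and pipes the resulting size arrays into `jq`; the use of `jq`, the 131072-byte (128 KiB) threshold, and the conversion to MB are assumptions added here, not part of the original commands.

```bash
#!/usr/bin/env bash
# Sketch (assumption): sum the sizes of all JSON objects larger than 128 KiB
# in the two public buckets referenced in the patch, reported in MB.
set -euo pipefail

for bucket in nz-elevation nz-imagery; do
    # Emits a JSON array of object sizes (in bytes) for keys ending in "json";
    # the AWS CLI paginates list-objects-v2 automatically.
    aws s3api list-objects-v2 \
        --bucket="$bucket" \
        --no-sign-request \
        --output=json \
        --query="Contents[?ends_with(Key, 'json')].Size"
done |
    # Combine both arrays, keep sizes above 131072 bytes (128 KiB), sum, convert to MB.
    jq --slurp 'flatten | map(select(. > 131072)) | add / 1e6'
```

Run as-is this prints a single number, the total megabytes of JSON above the pricing threshold across both buckets, which is presumably how the 33 MB figure in the patch was derived.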