Scylla Manager Agent overwhelming Scylla Manager #3920

Closed
rwnd-bradley opened this issue Jul 7, 2024 · 6 comments

Comments

@rwnd-bradley

rwnd-bradley commented Jul 7, 2024

Hi, I'm using the Scylla helm stack and have observed 10 requests per second per Scylla node hitting my nginx ingress, which sits in front of a Ceph S3 cluster. Based on the logs, the Scylla Manager Agent appears to be constantly checking the availability of the S3 location, to the point of generating 25 MB/s of traffic. Connections to 192.168.138.32 (the scylla-manager-7b74b68c49-778ts pod) also frequently drop, likely because the Scylla Manager pod's CPU is maxing out.

I also notice "rclone" in the logs, even though I have only configured S3 in scylla-manager-agent.yaml, defined in the scylla-manager-agent-config Scylla Manager secret.

Scylla scylla-manager-agent pod:

{"L":"INFO","T":"2024-07-07T12:32:44.124Z","M":"http: TLS handshake error from 192.168.138.32:52390: EOF"}
{"L":"INFO","T":"2024-07-07T12:32:44.125Z","M":"http: TLS handshake error from 192.168.138.32:52410: read tcp 192.168.138.54:10001->192.168.138.32:52410: read: connection reset by peer"}
{"L":"INFO","T":"2024-07-07T12:32:44.125Z","M":"http: TLS handshake error from 192.168.138.32:52416: read tcp 192.168.138.54:10001->192.168.138.32:52416: read: connection reset by peer"}
{"L":"INFO","T":"2024-07-07T12:32:45.647Z","N":"rclone","M":"Location check done"}
{"L":"INFO","T":"2024-07-07T12:32:46.733Z","N":"rclone","M":"Location check done"}
{"L":"INFO","T":"2024-07-07T12:32:46.914Z","M":"http: TLS handshake error from 192.168.138.32:52638: EOF"}
{"L":"INFO","T":"2024-07-07T12:32:46.920Z","M":"http: TLS handshake error from 192.168.138.32:52662: EOF"}
{"L":"INFO","T":"2024-07-07T12:32:46.922Z","M":"http: TLS handshake error from 192.168.138.32:52746: EOF"}
{"L":"INFO","T":"2024-07-07T12:32:48.667Z","N":"rclone","M":"Location check done"}
{"L":"INFO","T":"2024-07-07T12:32:48.890Z","M":"http: TLS handshake error from 192.168.138.32:52874: read tcp 192.168.138.54:10001->192.168.138.32:52874: read: connection reset by peer"}
{"L":"INFO","T":"2024-07-07T12:32:48.890Z","M":"http: TLS handshake error from 192.168.138.32:52858: read tcp 192.168.138.54:10001->192.168.138.32:52858: read: connection reset by peer"}
{"L":"INFO","T":"2024-07-07T12:32:50.629Z","N":"rclone","M":"Location check done"}
{"L":"INFO","T":"2024-07-07T12:32:50.815Z","M":"http: TLS handshake error from 192.168.138.32:53022: EOF"}

Nginx Ingress:

xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:04 +0000] "GET /scylla-backup/scylla-manager-agent-3303765332/test HTTP/1.1" 200 1 "-" "Scylla Manager Agent 3.3.0" 504 0.045 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 1 0.045 200 7bb5540496bce93d05c97c60f4c1b84d
192.168.79.61 - - [07/Jul/2024:12:38:04 +0000] "HEAD /scylla-backup/scylla-manager-agent-144953763 HTTP/1.1" 404 0 "-" "Scylla Manager Agent 3.3.0" 476 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 404 5434ad5275da774d7b14816d4ed4a7e5
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:04 +0000] "HEAD /scylla-backup/scylla-manager-agent-1400398288 HTTP/1.1" 404 0 "-" "Scylla Manager Agent 3.3.0" 477 0.000 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 404 70c4977acae65f044480c0745612691d
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:04 +0000] "HEAD /scylla-backup/scylla-manager-agent-3090407778 HTTP/1.1" 404 0 "-" "Scylla Manager Agent 3.3.0" 477 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.002 404 48253a1f1821edc264d5f8b673857336
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:04 +0000] "HEAD /scylla-backup/scylla-manager-agent-3303765332 HTTP/1.1" 404 0 "-" "Scylla Manager Agent 3.3.0" 477 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 404 a93d9e6131b5167082e6e5fb8b9fc8ef
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:04 +0000] "GET /scylla-backup?delimiter=&max-keys=1000&prefix=scylla-manager-agent-3090407778%2F HTTP/1.1" 200 607 "-" "Scylla Manager Agent 3.3.0" 534 0.043 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 607 0.043 200 ccb1cd1f0fcaa960e8f315289d1aa7f5
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:04 +0000] "GET /scylla-backup?delimiter=&max-keys=1000&prefix=scylla-manager-agent-1400398288%2F HTTP/1.1" 200 607 "-" "Scylla Manager Agent 3.3.0" 534 0.047 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 607 0.047 200 ef0abf164821f315270d9315db09f4c8
192.168.79.61 - - [07/Jul/2024:12:38:04 +0000] "GET /scylla-backup?delimiter=&max-keys=1000&prefix=scylla-manager-agent-144953763%2F HTTP/1.1" 200 605 "-" "Scylla Manager Agent 3.3.0" 533 0.048 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 605 0.047 200 b119d6ab45ed8c75a2f56ece39845fe4
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:04 +0000] "GET /scylla-backup?delimiter=&max-keys=1000&prefix=scylla-manager-agent-3303765332%2F HTTP/1.1" 200 607 "-" "Scylla Manager Agent 3.3.0" 534 0.046 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 607 0.046 200 f8df5aabfc83d5f6f94cfb27667b5f98
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:04 +0000] "DELETE /scylla-backup/scylla-manager-agent-1400398288/test HTTP/1.1" 204 0 "-" "Scylla Manager Agent 3.3.0" 507 0.048 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.048 204 fcfea6981d60d1bf8fa92f57a6ea6f14
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:04 +0000] "DELETE /scylla-backup/scylla-manager-agent-3090407778/test HTTP/1.1" 204 0 "-" "Scylla Manager Agent 3.3.0" 507 0.053 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.053 204 1a72c60c343de46785d44e5c9692222b
192.168.79.61 - - [07/Jul/2024:12:38:04 +0000] "DELETE /scylla-backup/scylla-manager-agent-144953763/test HTTP/1.1" 204 0 "-" "Scylla Manager Agent 3.3.0" 506 0.053 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.053 204 923296fcd2821416927cc0d843cb926a
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:04 +0000] "DELETE /scylla-backup/scylla-manager-agent-3303765332/test HTTP/1.1" 204 0 "-" "Scylla Manager Agent 3.3.0" 507 0.055 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.055 204 49f055fb163e4efe98f2e206976d2a54
192.168.79.61 - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-3978206805/test HTTP/1.1" 404 0 "-" "Scylla Manager Agent 3.3.0" 482 0.002 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 404 d1a52f45aedb841d0dd3997ee76a91db
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-846896754/test HTTP/1.1" 404 0 "-" "Scylla Manager Agent 3.3.0" 481 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 404 9a1b23a31b68455cf3655230dc564db1
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-1183589923/test HTTP/1.1" 404 0 "-" "Scylla Manager Agent 3.3.0" 482 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 404 f760488f43626504c29070569b997437
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-516537781/test HTTP/1.1" 404 0 "-" "Scylla Manager Agent 3.3.0" 481 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 404 89ba2de0e0e04a9afa1fefd094ac1f4f
192.168.79.61 - - [07/Jul/2024:12:38:05 +0000] "PUT /scylla-backup/scylla-manager-agent-3978206805/test?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=xxxxx%2F20240707%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20240707T123805Z&X-Amz-Expires=900&X-Amz-SignedHeaders=content-md5%3Bcontent-type%3Bhost%3Bx-amz-acl%3Bx-amz-meta-mtime&X-Amz-Signature=f37ce828da3cadbf426c3bcf43bf261c3fa7c3fbf21a971fcf48616ca7253e74 HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 643 0.013 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.013 200 5abd9117fa758aee8670df4a7e6d6e17
192.168.79.61 - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-3978206805/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 482 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 200 dd760a204b1f3ff1504d64bee48dacc6
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "PUT /scylla-backup/scylla-manager-agent-516537781/test?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=xxxxx%2F20240707%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20240707T123805Z&X-Amz-Expires=900&X-Amz-SignedHeaders=content-md5%3Bcontent-type%3Bhost%3Bx-amz-acl%3Bx-amz-meta-mtime&X-Amz-Signature=e73ca4be3976b494e4b9e08fd4b876af326f26ae274dff9d0496e39d76e08111 HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 642 0.017 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.016 200 f6e86c906a539e063dc18d58420c253a
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "PUT /scylla-backup/scylla-manager-agent-846896754/test?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=xxxxx%2F20240707%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20240707T123805Z&X-Amz-Expires=900&X-Amz-SignedHeaders=content-md5%3Bcontent-type%3Bhost%3Bx-amz-acl%3Bx-amz-meta-mtime&X-Amz-Signature=83a1d87fd48798c6dcd15056ab982139b18c40ec093c02c32dbfa2f8cedadb79 HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 642 0.017 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.017 200 a3ef5852a2b252ba0c66e44773db879c
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-516537781/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 481 0.000 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.000 200 86f6e79d67fb687f79535b64f0e89d87
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-846896754/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 481 0.000 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.000 200 2242563f996686dd0350706a47af79e7
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "PUT /scylla-backup/scylla-manager-agent-1183589923/test?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=xxxxx%2F20240707%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20240707T123805Z&X-Amz-Expires=900&X-Amz-SignedHeaders=content-md5%3Bcontent-type%3Bhost%3Bx-amz-acl%3Bx-amz-meta-mtime&X-Amz-Signature=9ed6bd86c9e2f0adafdfaced7b2c4dbbcfb37e79e7c79526196c3c49e79d1941 HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 642 0.020 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.021 200 ebb087a854473a0d0ed1e98566379da8
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-1183589923/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 482 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 200 017fb0d3063e50019ed650f6716df755
192.168.79.61 - - [07/Jul/2024:12:38:05 +0000] "GET /scylla-backup?delimiter=%2F&max-keys=1000&prefix=scylla-manager-agent-3978206805%2F HTTP/1.1" 200 631 "-" "Scylla Manager Agent 3.3.0" 537 0.048 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 631 0.049 200 6b483a19d3a92a95311e662129f2ce9f
192.168.79.61 - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-3978206805/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 482 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 200 ac7529ba4c07b7b9f15aa96629500efe
192.168.79.61 - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-3978206805/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 482 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 200 3107815876f9f42c9526bc0c63393e03
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "GET /scylla-backup?delimiter=%2F&max-keys=1000&prefix=scylla-manager-agent-846896754%2F HTTP/1.1" 200 629 "-" "Scylla Manager Agent 3.3.0" 536 0.052 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 629 0.052 200 816cac1356c04b7ae61055c30754ab2d
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "GET /scylla-backup?delimiter=%2F&max-keys=1000&prefix=scylla-manager-agent-516537781%2F HTTP/1.1" 200 629 "-" "Scylla Manager Agent 3.3.0" 536 0.052 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 629 0.052 200 4167b1feb5fed4c27932dfc46b5cd9bc
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "GET /scylla-backup?delimiter=%2F&max-keys=1000&prefix=scylla-manager-agent-1183589923%2F HTTP/1.1" 200 631 "-" "Scylla Manager Agent 3.3.0" 537 0.048 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 631 0.048 200 b2185bb2e55edad371865c3d1c7756a3
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-846896754/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 481 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 200 bb8f5f20afdc1b0be360223bf365f249
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-1183589923/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 482 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 200 afbc3735965d03df5c36a3dcea535b8a
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-516537781/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 481 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 200 944b19648d5b360ca8ebf29c0a3c676b
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-846896754/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 481 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 200 0b37bb7408da3064ddaba13bb4307cd0
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-1183589923/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 482 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 200 43356432c41b1ae052d30eaa38f9db47
xxx.xxx.xxx.xxx - - [07/Jul/2024:12:38:05 +0000] "HEAD /scylla-backup/scylla-manager-agent-516537781/test HTTP/1.1" 200 0 "-" "Scylla Manager Agent 3.3.0" 481 0.001 [rook-ceph-rook-ceph-rgw-ceph-objectstore-http] [] 192.168.3.180:8080 0 0.001 200 5f090d1c914282a8535736114ce1a196
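As a quick way to corroborate the roughly 10 requests/second figure, the ingress log itself can be bucketed per second. A minimal sketch, assuming the default nginx access-log format shown above (the timestamp is the 4th whitespace-separated field; the log path in the usage comment is a placeholder):

```shell
# Summarize "Scylla Manager Agent" requests per second from an nginx
# access log supplied on stdin. Assumes the default log format, where
# field 4 looks like "[07/Jul/2024:12:38:04"; only HH:MM:SS is kept.
agent_req_rate() {
  grep 'Scylla Manager Agent' \
    | awk '{
        gsub(/\[/, "", $4)       # strip the leading bracket
        split($4, t, ":")        # t[2]=HH t[3]=MM t[4]=SS
        count[t[2] ":" t[3] ":" t[4]]++
      }
      END { for (k in count) print k, count[k] }' \
    | sort
}

# Usage (path is a placeholder):
#   agent_req_rate < /var/log/nginx/access.log
```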

Is this intended behavior? I find it hard to believe that the location needs to be checked so frequently when my backup task is scheduled once every 4 hours.

Scylla version: 6.0.1
Scylla Manager Agent version: 3.3.0
Scylla Manager version: 3.3.0
Scylla Operator version: 1.13
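For context, in the operator-managed stack the 4-hour schedule comes from the ScyllaCluster resource rather than sctool. A rough sketch of such a backup task follows; the field names reflect my recollection of the operator's BackupTaskSpec and the bucket name is taken from the ingress logs above, so verify both against the CRD version you run:

```yaml
# Hypothetical ScyllaCluster excerpt; check field names against your
# scylla-operator CRD before use.
apiVersion: scylla.scylladb.com/v1
kind: ScyllaCluster
metadata:
  name: scylla
spec:
  backups:
    - name: backup-4h           # task name (arbitrary)
      location:
        - s3:scylla-backup      # bucket seen in the ingress logs above
      interval: 4h              # run once every 4 hours
```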

@rwnd-bradley
Author

Since increasing the CPU limits from 1000m to 4000m, pod monitoring shows usage sitting around 1200m, so "overwhelming" is slightly misleading. Regardless, I'm still seeing errors in the Scylla Manager Agent logs, and the high bandwidth usage remains.

{"L":"INFO","T":"2024-07-07T13:04:18.573Z","N":"rclone","M":"Location check done"}
{"L":"INFO","T":"2024-07-07T13:04:19.372Z","M":"http: TLS handshake error from 192.168.3.170:45592: EOF"}
{"L":"INFO","T":"2024-07-07T13:04:19.846Z","N":"rclone","M":"Location check done"}
{"L":"INFO","T":"2024-07-07T13:04:21.092Z","N":"rclone","M":"Location check done"}
{"L":"INFO","T":"2024-07-07T13:04:22.307Z","N":"rclone","M":"Location check done"}
{"L":"INFO","T":"2024-07-07T13:04:22.484Z","M":"http: TLS handshake error from 192.168.3.170:45912: read tcp 192.168.3.172:10001->192.168.3.170:45912: read: connection reset by peer"}
{"L":"INFO","T":"2024-07-07T13:04:23.619Z","N":"rclone","M":"Location check done"}

@Michal-Leszczynski
Collaborator

I also notice "rclone" in the logs while I have only configured S3 in the scylla-manager-agent.yaml, defined in the scylla-manager-agent-config Scylla manager secret.

Rclone is the tool the Scylla Manager Agent uses to transfer data to/from supported backup locations, so it always appears in the logs.

Is this intended behavior? I find it hard to believe that the location needs to be checked so frequently when my backup task is scheduled once every 4 hours.

That's right: the location should be checked only when a backup task is added or run.
I suspect this might be a ScyllaDB Operator issue where the task is being modified frequently.

To verify that, it would be good to collect logs with must-gather.
FYI @rzetelskik

@rzetelskik
Member

That's right: the location should be checked only when a backup task is added or run.
I suspect this might be a ScyllaDB Operator issue where the task is being modified frequently.

scylladb/scylla-operator#1827, leaving a cross-reference here for the record.

But in order to verify that, it would be good to collect logs with must-gather.

+1. @Michal-Leszczynski feel free to transfer this to the operator repo when you have the artifacts and verify it's not a manager issue.

@rwnd-bradley
Author

@Michal-Leszczynski @rzetelskik Thanks for the reply. Is there somewhere I can privately send you the output of must-gather?

@Michal-Leszczynski
Collaborator

You can send it via email to [email protected]

@Michal-Leszczynski
Collaborator

@rwnd-bradley I looked at the logs and it seems this is indeed an instance of scylladb/scylla-operator#1827.
That operator issue is waiting for #3219 to be fixed first on the manager side (I'm working on it right now).

Let's close this issue as a duplicate of scylladb/scylla-operator#1827 and track progress there.


3 participants