Impact of compression on scrubs #13139
-
I'm trying to understand how compression is treated during a scrub. Let's say I have a dataset or zvol that is 1TB in size with 50% compression (a 2.00x compressratio). Is the data first decompressed and then checked, effectively scrubbing the full 1TB of data, or is only 500GB of data scrubbed?
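To make the sizes concrete, here is a quick back-of-the-envelope sketch in Python. The numbers are the question's own assumptions; `compressratio` in ZFS is the ratio of logical (uncompressed) size to physical (on-disk) size.

```python
# Hypothetical numbers from the question above, for illustration only.
logical_tb = 1.0        # logical (uncompressed) size of the dataset, in TB
compressratio = 2.0     # as reported by `zfs get compressratio`

# compressratio = logical size / physical size, so:
physical_tb = logical_tb / compressratio
print(f"On-disk (physical) size: {physical_tb:.2f} TB")  # 0.50 TB
```

So the question is whether a scrub reads the 1TB logical view of the data or the 0.5TB that actually sits on disk.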
-
What I was trying to get at is the order of operations in the write pipeline. I ended up finding the answer:
> When a file is written, the data is compressed, encrypted, and the checksum is verified. Then, the data is deduplicated, if possible.

Since the checksum is computed after compression, it covers the compressed on-disk bytes. A scrub therefore re-reads and verifies the physical blocks without decompressing them, so in the example above roughly 500GB is read, not the full 1TB.
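To illustrate why that ordering answers the question, here is a minimal toy sketch of the pipeline in Python. Everything in it is an illustrative stand-in rather than ZFS internals: the XOR "encryption", the SHA-256 checksum, and the `write_block`/`scrub_block` names are all assumptions made for the example.

```python
import hashlib
import zlib

KEY = 0x5A  # toy single-byte XOR "key", for illustration only

def write_block(logical_data):
    """Write path, in the order the docs describe: compress -> encrypt -> checksum."""
    compressed = zlib.compress(logical_data)           # 1. compress
    encrypted = bytes(b ^ KEY for b in compressed)     # 2. "encrypt" (toy XOR)
    checksum = hashlib.sha256(encrypted).hexdigest()   # 3. checksum the physical bytes
    return encrypted, checksum

def scrub_block(on_disk, checksum):
    """A scrub re-reads the physical bytes and re-verifies the checksum;
    it never needs to decrypt or decompress anything."""
    return hashlib.sha256(on_disk).hexdigest() == checksum

logical = b"A" * 1_000_000                  # ~1 MB of highly compressible data
physical, cksum = write_block(logical)
print(len(logical), len(physical))          # physical is a tiny fraction of logical
print(scrub_block(physical, cksum))         # True, without touching the logical data
```

Because the checksum covers the post-compression bytes, `scrub_block` can verify integrity by re-reading only the physical payload, which is why a scrub's I/O scales with the compressed size of the pool rather than the logical size.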