speed up outcome file compression
Signed-off-by: Dave Rodgman <[email protected]>
daverodgman committed Jan 2, 2024
1 parent 135c5f5 commit c662496
Showing 1 changed file with 4 additions and 4 deletions.
8 changes: 4 additions & 4 deletions vars/analysis.groovy
@@ -237,16 +237,16 @@ def process_outcomes() {
deleteDir()
}

-// The complete outcome file is ~14MB compressed as I write.
+// The complete outcome file is 2.1GB uncompressed / 56MB compressed as I write.
// Often we just want the failures, so make an artifact with just those.
// Only produce a failure file if there was a failing job (otherwise
// we'd just waste time creating an empty file).
if (gen_jobs.failed_builds) {
sh '''\
-awk -F';' '$5 == "FAIL"' outcomes.csv >"failures.csv"
+grep ';FAIL;' outcomes.csv >"failures.csv"
# Compress the failure list if it is large (for some value of large)
if [ "$(wc -c <failures.csv)" -gt 99999 ]; then
-xz failures.csv
+xz -0 -T8 failures.csv
fi
'''
}
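Note on the filter change above: the two commands are not exact equivalents. The awk form tests only the fifth semicolon-separated field, while the grep form matches ';FAIL;' anywhere in the line, which is cheaper per line but assumes that substring can only come from the result field (the field position is inferred from the awk expression, not stated elsewhere in this commit). A minimal side-by-side sketch:

    # exact match on the 5th ';'-separated field (slower: per-line field splitting)
    awk -F';' '$5 == "FAIL"' outcomes.csv >"failures.csv"
    # substring match anywhere in the line (faster; assumes ';FAIL;' only occurs
    # in the result field)
    grep ';FAIL;' outcomes.csv >"failures.csv"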
@@ -258,7 +258,7 @@ fi
}
}
} finally {
-sh 'xz outcomes.csv'
+sh 'xz -0 -T8 outcomes.csv'
archiveArtifacts(artifacts: 'outcomes.csv.xz, failures.csv*',
fingerprint: true,
allowEmptyArchive: true)
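For reference, xz's -0 flag selects the fastest (lowest-ratio) compression preset (the default is -6), and -T8 lets xz split the input into blocks and compress up to 8 of them in parallel (multi-threaded compression requires xz >= 5.2, and block splitting can make the archive slightly larger than a single-threaded run). A hypothetical way to check the speed/size trade-off locally, not part of the commit (output file names are placeholders):

    # -c writes the compressed stream to stdout and leaves outcomes.csv in place,
    # so the same input can be compressed twice for comparison.
    time xz -c outcomes.csv >outcomes.default.xz       # default: preset -6, single thread
    time xz -c -0 -T8 outcomes.csv >outcomes.fast.xz   # this commit's settings
    ls -l outcomes.default.xz outcomes.fast.xz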
