
[Quantization] Channel-wise Output Activation Quantization for Attention QKV Modules + KV-cache channel quantization #749

Triggered via pull request on March 7, 2025 at 14:28 by @horheynm (synchronize on #1233, branch attn_quant)
Status: Failure
Total duration: 29m 32s

test-check-transformers.yaml (on: pull_request)

detect-changes: 5s
transformers-tests: 27m 51s
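For context, a workflow with this shape (a quick change-detection job gating a longer test job) might look roughly like the sketch below. Only the filename, trigger, and job names come from this run page; the runner labels, `needs` dependency, output name, checkout action, and test command are assumptions, not the repository's actual test-check-transformers.yaml.

```yaml
# Minimal sketch of a two-job workflow with this structure (assumed, not the real file).
name: test-check-transformers

on: pull_request

jobs:
  detect-changes:
    runs-on: ubuntu-latest            # assumed runner
    outputs:
      run-tests: ${{ steps.changes.outputs.run-tests }}   # assumed output name
    steps:
      - uses: actions/checkout@v4
      - id: changes
        # Placeholder change detection; the real job would inspect changed paths.
        run: echo "run-tests=true" >> "$GITHUB_OUTPUT"

  transformers-tests:
    needs: detect-changes             # assumed dependency on the detection job
    if: needs.detect-changes.outputs.run-tests == 'true'
    runs-on: ubuntu-latest            # assumed runner
    steps:
      - uses: actions/checkout@v4
      # Placeholder test command; the real job runs the transformers test suite.
      - run: pytest tests
```

A failing step in transformers-tests would surface on this page as "Process completed with exit code 1", which matches the annotations below.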

Annotations

4 errors
transformers-tests: Process completed with exit code 1.
transformers-tests: Process completed with exit code 1.
transformers-tests: Process completed with exit code 1.
transformers-tests: Process completed with exit code 1.