[Quantization] Channel-wise Output Activation Quantization for Attention QKV Modules + KV-cache channel quantization #749

Annotations: 4 errors

transformers-tests — failed Mar 7, 2025 in 27m 51s