
[Codegen][Tuner] attr verifier for tuning specs #19486

Merged
merged 9 commits into from
Dec 18, 2024

Conversation

bangtianliu
Contributor

@bangtianliu bangtianliu commented Dec 16, 2024

This PR is relevant to the task in #19214: add a discardable attr verifier for tuning spec entry points (iree_codegen.tuning_spec_entrypoint).

@bangtianliu bangtianliu requested a review from hanhanW as a code owner December 16, 2024 04:11
@bangtianliu bangtianliu marked this pull request as draft December 16, 2024 04:12
@bangtianliu bangtianliu requested review from kuhar and removed request for hanhanW December 16, 2024 04:12
@bangtianliu bangtianliu changed the title from "[Codegen][Tuner] attr verifier for truning specs" to "[Codegen][Tuner] attr verifier for tuning specs" Dec 16, 2024
@bangtianliu bangtianliu marked this pull request as ready for review December 16, 2024 23:30
Signed-off-by: Bangtian Liu <[email protected]>
@bangtianliu bangtianliu requested a review from kuhar December 17, 2024 22:04
@bangtianliu bangtianliu requested a review from kuhar December 18, 2024 18:52
@bangtianliu bangtianliu requested a review from kuhar December 18, 2024 19:45
Member

@kuhar kuhar left a comment


Mostly LGTM now

Comment on lines 149 to 151 (suggested message change):

  return (*defaultTransformLibrary).emitError()
-        << "Verification failed for default tuning spec";
+        << "Default tuning spec " << defaultTuningSpecName
+        << " failed to verify";
Member


Have you confirmed that this error gets emitted when the default spec is invalid?

Member


@bangtianliu can you please reply instead of marking this as resolved? I don't know what the outcome of this is.

Contributor Author


Yes, I did check it and fixed the crashes.

Member

@kuhar kuhar left a comment


LGTM % one nit

Signed-off-by: Bangtian Liu <[email protected]>
@MaheshRavishankar
Contributor

Could you explain a little bit more about what this is trying to verify? I am just trying to fit it within my mental model. Thanks in advance for explaining it to me.

@bangtianliu bangtianliu merged commit e553425 into iree-org:main Dec 18, 2024
40 checks passed
@bangtianliu
Contributor Author

bangtianliu commented Dec 18, 2024

> Could you explain a little bit more about what this is trying to verify? I am just trying to fit it within my mental model. Thanks in advance for explaining it to me.

The primary purpose of adding a verifier is to ensure the correctness of all provided tuning specifications, including the default tuning spec, user-provided tuning specs, and the output tuning specs after linking.

Currently, this PR primarily focuses on verifying the following (a minimal well-formed example is sketched after this comment):

  • If an attribute's name matches kTuningSpecEntrypointAttrName ("iree_codegen.tuning_spec_entrypoint"), the attribute value must be a UnitAttr.
  • The constraints related to the signature of transform::NamedSequenceOp

A detailed explanation can be found here: https://github.com/iree-org/iree/blob/main/compiler/src/iree/compiler/Codegen/Dialect/Codegen/IR/IREECodegenDialect.cpp#L55
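For illustration, a tuning spec entry point that satisfies both of these checks might look like the sketch below. This is an editorial example based on the rules described in this thread; the module name, sequence name, argument annotation, and body are placeholders, not code from this PR.

```mlir
// A minimal well-formed tuning spec entry point:
//  - the marker attribute is a UnitAttr (attribute name only, no value), and
//  - the named sequence takes a single !transform.any_op and yields a single !transform.any_op.
module @example_tuning_spec attributes { transform.with_named_sequence } {
  transform.named_sequence @__kernel_config(
      %variant_op: !transform.any_op {transform.readonly}) -> !transform.any_op
      attributes { iree_codegen.tuning_spec_entrypoint } {
    // Matchers and config-setting transform ops would go here.
    transform.yield %variant_op : !transform.any_op
  }
}
```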

@MaheshRavishankar
Contributor

> Could you explain a little bit more about what this is trying to verify? I am just trying to fit it within my mental model. Thanks in advance for explaining it to me.
>
> The primary purpose of adding a verifier is to ensure the correctness of all provided tuning specifications, including the default tuning spec, user-provided tuning specs, and the output tuning specs after linking.
>
> Currently, this PR primarily focuses on verifying:
>
> • If an attribute's name matches kTuningSpecEntrypointAttrName ("iree_codegen.tuning_spec_entrypoint"), the attribute value must be a UnitAttr.

What is the entry point attr name for?

> • The constraints related to the signature of transform::NamedSequenceOp
>
> A detailed explanation can be found here: https://github.com/iree-org/iree/blob/main/compiler/src/iree/compiler/Codegen/Dialect/Codegen/IR/IREECodegenDialect.cpp#L55

I don't really follow the reason for these constraints.

@bangtianliu
Contributor Author

bangtianliu commented Dec 19, 2024

> What is the entry point attr name for?
>
> I don't really follow the reason for these constraints.

The current tuning design leverages both default and user-provided tuning specs (specifications) to override compiler heuristics and guide dispatch code generation effectively.

The tuning specs are transform dialect libraries and must adhere to a specific format (as outlined in the tuning documentation):

  • All tuning spec entry points (named sequence ops) are marked with the iree_codegen.tuning_spec_entrypoint attribute. They have a single argument of type !transform.any_op and return a single value of type !transform.any_op.

Therefore, the entry point attribute name iree_codegen.tuning_spec_entrypoint is essential for marking named sequence ops in either default or user-provided tuning specs, and these constraints arise directly from the format requirements of named sequence ops (two counterexamples are sketched below).

For more details, please refer to the tuning documentation: https://iree.dev/reference/tuning/.
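For contrast, here is a sketch of specs that should fail these checks. These are hypothetical examples written for illustration (names invented, not taken from this PR's tests), assuming the verification rules described above.

```mlir
module @bad_tuning_spec attributes { transform.with_named_sequence } {
  // Should be rejected: the entrypoint attribute carries a value
  // instead of being a plain UnitAttr.
  transform.named_sequence @bad_attr_value(
      %op: !transform.any_op {transform.readonly}) -> !transform.any_op
      attributes { iree_codegen.tuning_spec_entrypoint = "yes" } {
    transform.yield %op : !transform.any_op
  }

  // Should be rejected: an entry point must return a single !transform.any_op,
  // but this one yields nothing.
  transform.named_sequence @bad_signature(
      %op: !transform.any_op {transform.readonly})
      attributes { iree_codegen.tuning_spec_entrypoint } {
    transform.yield
  }
}
```

The intent, as described above, is that such malformed specs get reported when the spec is verified, rather than surfacing later when the default and user-provided specs are linked and applied.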
