
CI: More useful checks? #6

Open
1 of 3 tasks
sadlerap opened this issue Jul 28, 2022 · 3 comments
@sadlerap
Contributor

sadlerap commented Jul 28, 2022

Our current CI setup for conformance testing isn't going to work long-term, in my opinion. Running against an empty cluster doesn't yield meaningful results: while it does exercise the code, execution stops as soon as a service binding actually needs to be resolved, since nothing in the cluster performs the resolution.

As it stands, I think we have a few options for replacements:

I'm not super happy about the second option, since testing tests doesn't seem like a good path to me. The other options feel like good places for improvement.

Thoughts?

@scothis
Member

scothis commented Jul 28, 2022

Can we run the conformance tests against a pinned "latest" release of the reference implementation?

Any particular implementation will likely have most tests pass (hopefully), but some tests may also fail. That's fine as long as we can record which tests are expected to fail. Passing this repo's CI then means that the conformance tests expected to pass do pass, and the tests expected to fail do fail.
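The gating rule described above ("expected passes pass, expected failures fail") can be sketched as a small check. This is only an illustration, not code from this repo; `ci_passes`, the test IDs, and the per-implementation expected-failure set are all hypothetical names:

```python
# Sketch of the proposed CI gate (all names hypothetical).
# `results` maps conformance test IDs to True (passed) / False (failed);
# `expected_failures` is the set of IDs an implementation declares it fails.

def ci_passes(results: dict[str, bool], expected_failures: set[str]) -> bool:
    """CI is green iff every test not listed as an expected failure passed,
    and every listed test actually failed. Requiring expected failures to
    fail surfaces stale entries once an implementation starts passing."""
    for test_id, passed in results.items():
        should_pass = test_id not in expected_failures
        if passed != should_pass:
            return False
    return True
```

For example, with `results = {"core/bind-secret": True, "webhook/reject-invalid": False}`, the gate is green when `"webhook/reject-invalid"` is declared as an expected failure, and red when it isn't.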

@sadlerap
Contributor Author

When we make a release of the reference implementation, this should be feasible. We should leave it to the implementations that we test to declare which tests they expect to pass and fail.

@baijum
Contributor

baijum commented Jul 29, 2022

> which tests they expect to pass and fail.

For implementations to run conformance tests selectively, we can classify them with Behave tags. A few examples:

  • @core
  • @direct-secret-reference
  • @workload-resource-mapping
  • @rbac
  • @webhook
  • @immutable-resource
  • @label-selector
  • @kube.version.1.21
  • @kube.version.1.22
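For illustration, a Behave feature using tags from the list above might look like the following. The scenario itself is hypothetical, not taken from the conformance suite:

```gherkin
@core @kube.version.1.21
Feature: Direct secret reference

  @direct-secret-reference
  Scenario: Workload is bound via a directly referenced Secret
    Given a cluster running Kubernetes 1.21 or later
    And a Secret containing binding data
    When a ServiceBinding references the Secret directly
    Then the binding data is projected into the workload
```

Implementations could then select suites with Behave's tag options, e.g. `behave --tags=@core --tags=~@webhook` to run core scenarios while skipping webhook-related ones.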
