
ci/transformers: run tests in utils, benchmark, generation, models #1190

Merged: 7 commits merged into intel:main from the ci branch on Dec 24, 2024

Conversation

dvrogozh (Contributor):

No description provided.

@dvrogozh force-pushed the ci branch 5 times, most recently from eb77669 to 34685ab on December 21, 2024 at 05:50
@dvrogozh changed the title from "ci/transformers: run tests in tests/utils" to "ci/transformers: run tests in utils, benchmark, generation, models" on Dec 23, 2024
@dvrogozh marked this pull request as ready for review on December 23, 2024 at 05:29
@@ -122,18 +122,64 @@ jobs:
          pip show torchvision | grep Version | grep xpu
          python -c 'import torch; exit(not torch.xpu.is_available())'
      - name: Run -k backbone tests
        env:
          TEST_CASE: 'tests_backbone'
chuanqi (Contributor), commenting on the diff above:

Can we add a test_cases parameter to this workflow? The parameter could be set to any combination of tests_backbone, tests_py, tests_benchmark, tests_generation, tests_models, tests_pipelines, tests_trainer, and tests_utils, with a default value covering the default test scope. Each test step would then check the value of test_cases to decide whether to run.
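A minimal sketch of what this suggestion could look like, assuming a workflow_dispatch trigger; the input name follows the comment, but the default value, runner label, and step body below are illustrative assumptions, not taken from the actual workflow:

```yaml
on:
  workflow_dispatch:
    inputs:
      test_cases:
        description: 'Comma-separated list of test scopes to run'
        required: false
        # Hypothetical default covering the default test scope.
        default: 'tests_backbone,tests_py,tests_benchmark,tests_generation,tests_models,tests_pipelines,tests_trainer,tests_utils'

jobs:
  tests:
    runs-on: ubuntu-latest  # placeholder; the real workflow targets XPU runners
    steps:
      # Each test step consults test_cases to decide whether to run.
      - name: Run -k backbone tests
        if: contains(github.event.inputs.test_cases, 'tests_backbone')
        env:
          TEST_CASE: 'tests_backbone'
        run: echo "backbone tests would run here"  # placeholder for the real pytest invocation
```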

Contributor:

Overall LGTM. I also support chuanqi's comment about adding a new parameter for choosing test scopes.

dvrogozh (Contributor, Author):

Yes, a test cases parameter is one of the things we should do next. My thoughts on that are as follows:

  • As a developer I want some flexibility in setting test cases when running the on-demand workflow. For example, running tests for one specific model, or running all tests matching some condition, might be needed. With that in mind I am inclined not to offer pre-defined test scopes only (like tests_backbone, tests_py, etc.). Instead, I think it is preferable to accept arbitrary input which is then passed straight to pytest (a sketch follows after this comment), like:
test_cases="tests/model/bert/ -k sdpa"
...
python -m pytest --make-reports=... <other-common-params> $test_cases
  • I also wonder how we will structure CI for other ecosystem projects. Currently, without a test cases parameter, we could add, for example, Hugging Face Accelerate tests into the same workflow; introducing a test cases parameter might complicate that addition.
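A sketch of this pass-through alternative, under the same workflow_dispatch assumption; the default value, report name, and runner label are illustrative:

```yaml
on:
  workflow_dispatch:
    inputs:
      test_cases:
        description: 'Passed verbatim to pytest, e.g. "tests/model/bert/ -k sdpa"'
        required: false
        default: 'tests/utils'  # hypothetical default scope

jobs:
  tests:
    runs-on: ubuntu-latest  # placeholder; the real workflow targets XPU runners
    steps:
      - name: Run tests
        env:
          # Routing free-form input through an env var avoids shell injection
          # from interpolating ${{ }} directly into the script.
          TEST_CASES: ${{ github.event.inputs.test_cases }}
        # $TEST_CASES is intentionally unquoted so "tests/model/bert/ -k sdpa"
        # reaches pytest as separate arguments; the report name is illustrative.
        run: |
          python -m pytest --make-reports=on_demand $TEST_CASES
```

Passing the input through an environment variable rather than substituting it into the run script is a common hardening step for free-form workflow inputs.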

@dvrogozh (Contributor, Author):

The change is specific to the Transformers tests; failures in the other CI tests are unrelated, so this can be merged. Let's do the test cases change as a follow-up to this one.

@dvrogozh added this pull request to the merge queue on Dec 24, 2024
Merged via the queue into intel:main with commit 212ee90 on Dec 24, 2024; 4 of 7 checks passed
@dvrogozh deleted the ci branch on December 24, 2024 at 02:09