ci/transformers: run tests in utils, benchmark, generation, models #1190
Conversation
eb77669 to 34685ab
Signed-off-by: Dmitry Rogozhkin <[email protected]>
@@ -122,18 +122,64 @@ jobs:
          pip show torchvision | grep Version | grep xpu
          python -c 'import torch; exit(not torch.xpu.is_available())'
      - name: Run -k backbone tests
        env:
          TEST_CASE: 'tests_backbone'
Can we add a test_cases parameter for this workflow? It could be set to any combination of tests_backbone, tests_py, tests_benchmark, tests_generation, tests_models, tests_pipelines, tests_trainer, tests_utils, with a default value covering the default test scope. Each test step would then check the value of test_cases to decide whether to run.
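The suggestion above could be sketched roughly as the following workflow fragment. This is a hypothetical illustration, not the workflow from this PR: the input name, scope names, and the contains() gating are assumptions based on the comment.

```yaml
# Hypothetical sketch of the suggested test_cases parameter.
# Input, job, and step names are illustrative, not from this PR.
on:
  workflow_dispatch:
    inputs:
      test_cases:
        description: 'Comma-separated list of test scopes to run'
        # Default value covering the default test scope.
        default: 'tests_backbone,tests_py,tests_utils'
        type: string

jobs:
  tests:
    runs-on: ubuntu-latest
    steps:
      # Each step checks whether its scope appears in test_cases
      # and is skipped otherwise.
      - name: Run -k backbone tests
        if: contains(inputs.test_cases, 'tests_backbone')
        env:
          TEST_CASE: 'tests_backbone'
        run: echo "would run scope $TEST_CASE here"
```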
Overall LGTM. I also agree with chuanqi's comment about adding a new parameter for choosing test scopes.
Yes, a test cases parameter is one of the things we should do next. My thoughts on that are as follows:
- As a developer I want some flexibility in setting test cases when running the on-demand workflow. For example, running the tests for one specific model might be needed, or running all tests matching some condition. With that in mind, I'm inclined to avoid offering only pre-defined test scopes (like tests_backbone, tests_py, etc.). Instead, I think allowing any input that is then passed through to pytest is preferable. Like:

  test_cases="tests/model/bert/ -k sdpa"
  ...
  python -m pytest --make-reports=... <other-common-params> $test_cases
- I wonder how we will structure CI for other ecosystem projects. Currently, without a test cases parameter, we could add, for example, Hugging Face Accelerate tests into the same workflow, but introducing a test cases parameter might complicate that addition.
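The free-form pass-through approach from the first bullet could be sketched like this. Again a hypothetical fragment, assuming a workflow_dispatch input; the default value and the report name are placeholders, not values from this PR.

```yaml
# Hypothetical sketch: pass the input verbatim to pytest instead of
# restricting callers to pre-defined scopes. All names are placeholders.
on:
  workflow_dispatch:
    inputs:
      test_cases:
        description: 'Paths and options passed straight through to pytest'
        default: 'tests/utils tests/generation'
        type: string

jobs:
  tests:
    runs-on: ubuntu-latest
    steps:
      - name: Run selected tests
        # Expanding the input unquoted lets callers supply both test
        # paths and pytest options, e.g. "tests/models/bert -k sdpa".
        run: |
          python -m pytest --make-reports=selected ${{ inputs.test_cases }}
```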
The change is specific to Transformers tests; failures in other CI tests are unrelated, so this can be merged. Let's handle the test cases parameter as a follow-up to this change.
No description provided.