
Data Providers and Skipped Tests #6011

Open
kevinrsoursib opened this issue Oct 22, 2024 · 2 comments
Labels
feature/data-provider Data Providers feature/test-runner CLI test runner type/enhancement A new idea that should be implemented

Comments

@kevinrsoursib

PHPUnit 10.5 tightened up data providers to flag an error if a data provider returns an empty set. This causes problems when the test the data provider feeds isn't something that can run in every situation where the tests are run. I'll use the example of a test of a feature that relies on a PHP module which may be absent.

If the module isn't there, we know when we are generating the data provider records that we're just going to skip the test entirely. There really isn't much point in generating a long list of records just to skip the test dozens of times (along with needlessly spamming the output). This is much more of a problem if generating the data is time-consuming or requires the absent module (for instance, the module defines constants that are needed as part of the test data sets).

Previously we could do something like:

if (!class_exists('module_class_name')) {
    return [];
}

and things would work out fine. Now the only option appears to be to generate the dataset regardless (if that's even possible) or to generate a dummy dataset when the test conditions aren't met. That isn't exactly hard, but it's an ugly workaround.
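For reference, the dummy-dataset workaround described above might look something like this (a sketch only; module_class_name, the dummy row, and buildRealCases() are placeholders, not real names):

```php
public static function provideModuleCases(): array
{
    if (!class_exists('module_class_name')) {
        // PHPUnit 10.5 errors on an empty set, so return one throwaway row;
        // the test method checks the same condition and calls markTestSkipped().
        return ['module missing' => [null]];
    }

    return self::buildRealCases(); // hypothetical helper; expensive, needs the module
}
```

The dummy row exists purely to satisfy the non-empty requirement, which is exactly the ugliness being complained about here.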

If we don't want to allow empty data sets, some way of marking the data provider as "skipped", in the same way we mark a test as skipped, would be useful to avoid generating dummy data we aren't going to do anything with.

Something like:

if (!class_exists('module_class_name')) {
    self::markTestSkipped('Requires Module');
}

@kevinrsoursib kevinrsoursib added the type/enhancement A new idea that should be implemented label Oct 22, 2024
@kubawerlos
Contributor

the example of a test of a feature that relies on a PHP module which may be absent.

If the module isn't there we know when we are generating the data provider records that we're just going to skip the test entirely.

@kevinrsoursib this sounds to me like you want to use RequiresPhpExtension.

use PHPUnit\Framework\Attributes\DataProvider;
use PHPUnit\Framework\Attributes\RequiresPhpExtension;
use PHPUnit\Framework\TestCase;

class Issue6011Test extends TestCase
{
    #[RequiresPhpExtension('I_DO_NOT_EXIST')]
    #[DataProvider('provideSomethingCases')]
    public function testSomething(): void
    {
        $this->fail('This should not be reached');
    }

    public static function provideSomethingCases(): array
    {
        return [];
    }
}

This results in:

OK, but some tests were skipped!
Tests: 1, Assertions: 0, Skipped: 1.

@kevinrsoursib
Author

That could actually work (my use case is a little different, but I think it could be handled similarly). I wouldn't have guessed from the documentation that annotating the test function would affect the data provider, though.
