Smoke tests for assorted plugins #7721
Conversation
Force-pushed from 579a78d to 7086e44
This adds a new tox environment that tests that installation of popular plugins does not break pytest. We can add more as time passes.
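A minimal sketch of what such a tox environment could look like (the environment name, plugin list, and test path here are illustrative assumptions, not the exact configuration from this PR):

```ini
[testenv:plugins]
# Install a set of popular plugins alongside pytest and run a small
# integration suite to verify nothing breaks on installation or collection.
deps =
    pytest-xdist
    pytest-cov
commands =
    pytest testing/plugins_integration
```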
Regression introduced in pytest-dev#7700
Force-pushed from 7086e44 to 0a27f69
@ssbarnea I've updated the PR to include simple integration tests. We should backport the commit that adds the integration tests to the 6.0.x branch.
Probably that configuration is there for historical reasons, but it doesn't make much sense to collect every *.py file in testing as a potential test nowadays.
I think this is a good starting point, but if others feel some plugin is missing, please speak up!
You need to ensure that the tests are actually run. Add a failing case with strict mode to verify.
You may want to look at the PYTEST_REQPASS feature added by pytest-plus, which asserts that an exact number of tests passed. It was aimed at CI/CD usage, to avoid accidents where tests were silently skipped.
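The idea can be sketched as a CLI invocation (assuming pytest-plus is installed; the required count here is an illustrative assumption):

```
# Fail the run unless exactly 5 tests pass; pytest-plus reads PYTEST_REQPASS
PYTEST_REQPASS=5 pytest testing/plugins_integration
```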
tox.ini
Outdated
```
pytest pytest_trio_integration.py
pytest pytest_anyio_integration.py
```
I think we should test all the async frameworks simultaneously - they have all agreed to use marks to allow them to all interoperate in a single test run, so we should test that here
At least as long as you don't use async fixtures. pytest-asyncio
is known to appropriate those regardless of marks (see pytest-dev/pytest-asyncio#124 for the issue).
At least as long as you don't use async fixtures.
That's a very good point, we currently don't test any async fixtures here.
@ssbarnea what's the scope you're happy to accept? I think as plugin maintainers become aware of this PR, more will want to be included. I don't want to see a situation where this scope creeps all the way to testing all of PyPI before it's merged! So I think we should merge this sooner rather than later, so maintainers can add their own smoke tests as PRs directly to pytest, rather than to this branch.
How popular is popular? Can we nail down criteria for inclusion?
Is that better than just requiring coverage in your tests?
The stated purpose is to test that pytest doesn't break. Shouldn't the full pytest suite be run instead of just a trivial case for each plugin?
xdist and cov are OK for a start. Good point about avoiding scope creep; we can always add more later.
This makes sense, but I would rather start things slowly:
Agreed! This is already a large improvement over what we have today, so let's get this in as soon as we feel this is good enough.
Add 3 more:
Hmm, as I feared: pytest-qt, besides depending on a large library, also requires an X server. While it is possible, I don't want to introduce complicated setups at this stage.
Anything else? If not, it would be nice if we could get some approvals here and get this merged. 👍
This is a great idea! Looks good. I agree on doing light testing of each plugin, this is a quick smoke test* to check nothing exploded. Should catch the major things, and plugins are still encouraged to test against pytest pre-releases.
(*Etymology: 'The term originates in hardware repair and has been applied to software. It's intended to be a quick test to see if the application "catches on fire" when run for the first time.')
Oh, one thing to keep in mind: it's possible a new plugin release will break current master, and a new PR will start to fail because of the plugin release, not the PR's changes. In such cases, it's a good idea to check whether a master build also fails in the same way. It may be necessary to temporarily skip smoke testing that plugin to prevent all PRs from failing, until either pytest or the plugin has a fix (it could be a problem on either side, or both!). But hopefully this will be rare. And in any case, it's good information to know there's a failure.
As the person doing most of the breaking, this will be very helpful!
Thanks everyone!
Of course, everyone should feel free to add more plugins that they consider relevant. 👍
Co-authored-by: Bruno Oliveira <[email protected]>
Co-authored-by: Thomas Grainger <[email protected]>
Co-authored-by: Kyle Altendorf <[email protected]>
[6.0.x] Smoke tests for assorted plugins (#7721)
This aims to create a tox environment that tests that installation of
popular plugins does not break pytest.
Related to pytest-dev/pytest-cov#430