pytest-runtime-xfail
pytest plugin providing a runtime_xfail fixture, callable as runtime_xfail(), so that a test can be marked as xfail based on decisions made at runtime.
Installation
Install with pip:
pip install pytest-runtime-xfail
Usage
Include the fixture, then call it if you want to mark a test as xfail at runtime.
def test_something(runtime_xfail):
    if runtime_condition:
        runtime_xfail()
    # ... the rest of your test
Can also be used in a fixture, of course.
@pytest.fixture()
def foo(runtime_xfail):
    if runtime_condition:
        runtime_xfail()
    # ... the rest of your fixture

def test_something(foo):
    # ... the rest of your test
Reason this plugin is needed
pytest allows you to mark tests as expected to fail, or xfail, in two ways.
- @pytest.mark.xfail. This allows you to mark tests or test parametrizations as xfail at test collection time (see the sketch after this list).
  - pytest runs tests marked with xfail just like any other test.
  - If the test fails, it will result in XFAIL.
  - If it passes, XPASS. Unless you have xfail_strict=true or @pytest.mark.xfail(strict=True), in which case a passing xfail-marked test results in FAIL.
    - This is useful to be alerted when an expected failing test starts to pass.
- pytest.xfail(). If you need information only known at runtime to decide if xfail is appropriate, you can call pytest.xfail() during a test or fixture.
  - pytest runs the test as normal UNTIL pytest.xfail() is called.
  - When pytest.xfail() is called, test execution stops and the test results in XFAIL.
  - The rest of the test is not run.
  - There is no way to get XPASS from pytest.xfail().
  - xfail_strict has no effect.
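For example, the marker form can be attached to individual parametrizations when the test is collected. This is an illustrative sketch, not part of this plugin; the test name and values are made up:

import pytest

@pytest.mark.parametrize(
    "value",
    [
        1,
        2,
        pytest.param(3, marks=pytest.mark.xfail(reason="known bug for 3")),
    ],
)
def test_collection_time_xfail(value):
    # The xfail mark for value == 3 is applied at collection time,
    # before the test body ever runs.
    assert value < 3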
There are times when we want a combination of these behaviors.
- We don't know until runtime if we should mark a test as xfail.
- We want the test run.
- We want the possibility of both XFAIL and XPASS results.
- We want to be able to use xfail_strict=true to alert us when the test starts passing.
This plugin fills that gap.
Alternatives
You can get around the same limitation yourself by adding the marker through the request object:
import pytest

def test_something(request):
    if runtime_condition:
        request.node.add_marker(pytest.mark.xfail(reason='some reason'))
    # ... rest of test
That's basically what this plugin does, just in a fixture.
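A minimal sketch of what such a fixture could look like, assuming it simply wraps request.node.add_marker (the plugin's actual implementation may differ):

import pytest

@pytest.fixture()
def runtime_xfail(request):
    def _runtime_xfail(reason=""):
        # Attach an xfail marker to the currently running test item,
        # so the test keeps running and can end in XFAIL or XPASS.
        request.node.add_marker(pytest.mark.xfail(reason=reason))
    return _runtime_xfail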
Example found in example/test_xfail.py
"""
Run this with
* pytest -v
* pytest -v -o xfail_strict=true
"""
import pytest

@pytest.mark.xfail()
def test_marker_pass():
    'Can be XPASS or FAIL (if xfail_strict)'
    assert True

@pytest.mark.xfail()
def test_marker_fail():
    'Will always be XFAIL'
    assert False  # this statement will be run

def test_old_xfail_pass():
    'Will always be XFAIL'
    pytest.xfail()
    assert True  # this statement will NOT be run

def test_old_xfail_fail():
    'Will always be XFAIL'
    pytest.xfail()
    assert False  # this statement will NOT be run

def test_runtime_xfail_pass(runtime_xfail):
    runtime_xfail()
    assert True  # this statement will be run

def test_runtime_xfail_fail(runtime_xfail):
    runtime_xfail()
    assert False  # this statement will be run

def test_runtime_xfail_reason(runtime_xfail):
    runtime_xfail(reason="for demo")
    assert False  # this statement will be run
Output:
(venv) $ pytest -v test_xfail.py
========================= test session starts ==========================
collected 7 items
test_xfail.py::test_marker_pass XPASS [ 14%]
test_xfail.py::test_marker_fail XFAIL [ 28%]
test_xfail.py::test_old_xfail_pass XFAIL [ 42%]
test_xfail.py::test_old_xfail_fail XFAIL [ 57%]
test_xfail.py::test_runtime_xfail_pass XPASS [ 71%]
test_xfail.py::test_runtime_xfail_fail XFAIL [ 85%]
test_xfail.py::test_runtime_xfail_reason XFAIL (for demo) [100%]
==================== 5 xfailed, 2 xpassed in 0.05s =====================
(venv) $ pytest -v test_xfail.py -o xfail_strict=true
========================= test session starts ==========================
collected 7 items
test_xfail.py::test_marker_pass FAILED [ 14%]
test_xfail.py::test_marker_fail XFAIL [ 28%]
test_xfail.py::test_old_xfail_pass XFAIL [ 42%]
test_xfail.py::test_old_xfail_fail XFAIL [ 57%]
test_xfail.py::test_runtime_xfail_pass FAILED [ 71%]
test_xfail.py::test_runtime_xfail_fail XFAIL [ 85%]
test_xfail.py::test_runtime_xfail_reason XFAIL (for demo) [100%]
===================== 2 failed, 5 xfailed in 0.04s =====================