PyTest · Testing · ~15 mins

Custom markers in PyTest - Build an Automation Script

Automate test execution with a custom pytest marker
Preconditions (3)
Step 1: Run pytest with the option to only execute tests marked with 'slow'
Step 2: Verify that only the test(s) with the 'slow' marker are executed
Step 3: Verify that tests without the 'slow' marker are skipped
Step 4: Check the test report to confirm the number of tests run matches the number of 'slow' marked tests
✅ Expected Result: Only tests marked with the 'slow' marker run and pass or fail as expected; other tests are skipped; test report shows correct test count
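The steps above assume a small sample module, conventionally named test_sample.py (the file name here is an assumption), with one test carrying the slow marker and one without it:

```python
# Hypothetical test_sample.py: one test marked 'slow', one unmarked.
import pytest

@pytest.mark.slow
def test_slow_example():
    # Stands in for a long-running test
    assert True

def test_fast_example():
    assert True
```

Running `pytest -m slow` against this file should select exactly one test and deselect the other.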
Automation Requirements - pytest
Assertions Needed:
Assert that the test output includes only tests with the 'slow' marker
Assert that tests without the 'slow' marker are not run
Assert that the test report summary matches the expected number of tests run
Best Practices:
Use pytest's @pytest.mark decorator to mark tests
Use pytest command line option '-m' to select tests by marker
Use pytest's capsys or caplog fixtures to capture output if needed
Avoid hardcoding test names; rely on marker filtering
Include marker registration in pytest.ini to avoid warnings
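The marker-filtering practice above can be tried directly from a shell. This sketch creates a hypothetical test_sample.py inline and runs pytest with both a positive and a negated `-m` expression:

```shell
# Create a minimal sample module (hypothetical file name test_sample.py)
cat > test_sample.py <<'EOF'
import pytest

@pytest.mark.slow
def test_slow_example():
    assert True

def test_fast_example():
    assert True
EOF

# Select only tests marked 'slow'
python3 -m pytest -v -m slow test_sample.py

# Invert the filter: run everything except 'slow' tests
python3 -m pytest -v -m "not slow" test_sample.py
```

The `-m` option accepts marker expressions, so filters like `"slow and not flaky"` work the same way.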
Automated Solution
PyTest
import re
import subprocess
import sys

# test_sample.py content
# Save this content in the same directory before running automation
#
# import pytest
#
# @pytest.mark.slow
# def test_slow_example():
#     assert True
#
# def test_fast_example():
#     assert True


def test_run_only_slow_marker():
    # Run pytest with -m slow to execute only tests marked as slow
    result = subprocess.run(
        [sys.executable, '-m', 'pytest', '-v', '-m', 'slow', 'test_sample.py'],
        capture_output=True,
        text=True
    )

    output = result.stdout

    # Assert the pytest run succeeded (exit code 0 means all selected tests passed)
    assert result.returncode == 0, f"Pytest failed with output:\n{output}"

    # The slow test should appear in the verbose output
    assert 'test_slow_example' in output, "Slow test did not run"

    # The fast test should not appear: deselected tests are not listed by name
    assert 'test_fast_example' not in output, "Fast test should not run when filtering by slow marker"

    # Parse the summary line (e.g. '1 passed in 0.03s') for the passed count
    match = re.search(r'(\d+) passed', output)
    assert match is not None, "No tests passed"
    passed_count = int(match.group(1))
    assert passed_count == 1, f"Expected 1 test to run, but {passed_count} ran"

This automation script runs pytest as a subprocess with the -m slow option to select only tests marked with the slow marker.

It captures the output and checks:

  • The pytest command succeeded (return code 0).
  • The output contains the slow test name test_slow_example, confirming it ran.
  • The output does not contain the fast test name test_fast_example, confirming it was skipped.
  • The test summary shows exactly one test passed, matching the expected count.

This approach uses subprocess to run pytest as a real user would, ensuring marker filtering works as expected.

Before running this automation, ensure that test_sample.py exists with the test functions shown above and that the slow marker is registered in pytest.ini to avoid warnings.
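For reference, a minimal pytest.ini registering the marker might look like this (the description text is illustrative):

```ini
[pytest]
markers =
    slow: marks a test as slow to run (select with -m slow)
```

Without this registration, pytest emits a PytestUnknownMarkWarning for each use of the custom marker.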

Common Mistakes - 4 Pitfalls
Not registering the custom marker in pytest.ini
Using hardcoded test names in assertions
Running pytest without the -m option to filter markers
Not capturing output to verify which tests ran
Bonus Challenge

Now add data-driven testing with 3 different inputs to the slow test using pytest.mark.parametrize
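One possible sketch of the challenge, stacking pytest.mark.parametrize under the slow marker (the three inputs chosen here are arbitrary examples):

```python
# Hypothetical parametrized version of the slow test in test_sample.py
import pytest

@pytest.mark.slow
@pytest.mark.parametrize("value", [1, 2, 3])
def test_slow_example(value):
    # Each input runs as its own test case:
    # test_slow_example[1], test_slow_example[2], test_slow_example[3]
    assert value > 0
```

Note that with three parametrized cases, the automation script's expected passed count would need to change from 1 to 3.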
