The problem: I have a single test function that I want to run many times on different data. I don't want to just get a single pass/fail result — I want to see whether each item of data makes the test pass or fail individually. But I also don't want to write a whole bunch of nearly identical tests, with only the data changing each time.
The solution: "parametrization" in the Pytest tool. Parametrization means defining parameters that will populate your function-arguments.
Define an iterable (call it test_data) containing the multiple items, each of which you want passed to your test function. Both it and a string containing an argument name (say "datum") are passed to a decorator called @pytest.mark.parametrize on any test function (say test_datum).
The names test_data, datum, and test_datum are dummies that you can change to whatever you want, although the name of the function has to start with test_ in order for Pytest to discover it.
You don't need to iterate manually through test_data or summon specific indices of it, as test_data[0], test_data[1], etc. Pytest will handle that. Pytest will iterate through test_data and assign each of its elements, in turn, to the target variable datum. It will call your test function on that argument over and over again, with the argument representing each of the elements of test_data in turn, and report the results to you one by one.
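As a minimal sketch (the names test_data, datum, and test_datum are the placeholder names described above):

```python
import pytest

# The iterable of data items (a placeholder name).
test_data = [1, 2, 3]

# The string "datum" names the argument that each item will populate.
@pytest.mark.parametrize("datum", test_data)
def test_datum(datum):
    # Pytest calls this function once per element of test_data.
    assert isinstance(datum, int)
```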
Run it from the command line with the pytest command, passing the name of your test file (or no arguments at all, in which case Pytest discovers test files automatically).
- A test file can contain as many test-functions as you like, and each can be decorated with @pytest.mark.parametrize if needed.
- Pytest normally runs all tests in sequence even if some fail. If you want Pytest to stop on the first failing test, use option -x.
- To see which arguments are passed into each test, use option -v. Unless the arguments are strings, numbers, or booleans, they will probably appear as automatically generated id-names. To include an identifying string for each data item, do so using the keyword-argument ids:
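A sketch of the ids usage (the data and labels here are invented for illustration):

```python
import pytest

test_data = [(1, 2), (10, 11)]

@pytest.mark.parametrize(
    "pair",
    test_data,
    ids=["small-pair", "big-pair"],  # one identifying string per data item
)
def test_pair(pair):
    # Each tuple from test_data arrives as `pair`.
    first, second = pair
    assert second == first + 1
```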
The argument-name string can also contain multiple names, comma-separated, if your test function takes multiple arguments (and if the data iterable contains tuples). For instance:
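A sketch of the two-argument form, including the pytest.param way of marking one case as expected to fail (the helper functions are invented for illustration):

```python
import pytest

def returns_true():
    return True

def returns_false():
    return False

@pytest.mark.parametrize(
    "func,expected",  # two comma-separated argument names
    [
        (returns_true, True),
        (returns_false, False),
        # The official way to mark one specific case as expected-to-fail:
        pytest.param(returns_false, True, marks=pytest.mark.xfail),
    ],
)
def test_call(func, expected):
    assert func() is expected
```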
I have used this pattern to pass in 2-tuples containing an external function to be called and a boolean describing whether the external function is expected to return True or False. (A more official way of marking specific test cases as "expected to fail" is with the pytest.param function and its marks=pytest.mark.xfail argument, within the list of parameters.)
If you want to see the actual data passed into each iteration of the test function, you can again use the decorator's ids keyword to label each case with its data:
Note that the first decorator example above is shorthand for the following:
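The longhand is presumably the explicit keyword-argument form of the decorator (an assumption, since the original example is not shown here; argnames and argvalues are the decorator's actual parameter names):

```python
import pytest

test_data = ["a", "b"]

# Equivalent to @pytest.mark.parametrize("datum", test_data)
@pytest.mark.parametrize(argnames="datum", argvalues=test_data)
def test_datum(datum):
    assert datum in test_data
```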
More details on Pytest parametrization, with further examples, can be found in the official Pytest documentation.
pytest fixtures: explicit, modular, scalable
New in version 2.0/2.3/2.4.
The purpose of test fixtures is to provide a fixed baseline upon which tests can reliably and repeatedly execute. pytest fixtures offer dramatic improvements over the classic xUnit style of setup/teardown functions:
- fixtures have explicit names and are activated by declaring their use from test functions, modules, classes or whole projects.
- fixtures are implemented in a modular manner, as each fixture name triggers a fixture function which can itself use other fixtures.
- fixture management scales from simple unit to complex functional testing, allowing you to parametrize fixtures and tests according to configuration and component options, or to re-use fixtures across function, class, module or whole test session scopes.
In addition, pytest continues to support classic xunit-style setup. You can mix both styles, moving incrementally from classic to new style, as you prefer. You can also start out from existing unittest.TestCase style or nose based projects.
Fixtures as Function arguments
Test functions can receive fixture objects by naming them as an input argument. For each argument name, a fixture function with that name provides the fixture object. Fixture functions are registered by marking them with @pytest.fixture. Let’s look at a simple self-contained test module containing a fixture and a test function using it:
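A sketch along the lines of the documentation's own smtp example (the deliberate assert 0 is there to force a failure for inspection):

```python
# content of ./test_smtpsimple.py
import smtplib

import pytest

@pytest.fixture
def smtp():
    # The fixture function creates the value injected into tests below.
    return smtplib.SMTP("smtp.gmail.com", 587, timeout=5)

def test_ehlo(smtp):
    response, msg = smtp.ehlo()
    assert response == 250
    assert 0  # deliberate failure, for demonstration purposes
```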
Here, test_ehlo needs the smtp fixture value. pytest will discover and call the @pytest.fixture marked smtp fixture function. Running the test produces a failure report.
In the failure traceback we see that the test function was called with a smtp argument, the smtplib.SMTP() instance created by the fixture function. The test function fails on our deliberate assert 0. Here is the exact protocol used by pytest to call the test function this way:
- pytest finds the test_ehlo test because of the test_ prefix. The test function needs a function argument named smtp. A matching fixture function is discovered by looking for a fixture-marked function named smtp.
- smtp() is called to create an instance.
- test_ehlo(<SMTP instance>) is called and fails in the last line of the test function.
Note that if you misspell a function argument or want to use one that isn’t available, you’ll see an error with a list of available function arguments.
You can always issue pytest --fixtures to see the available fixtures.
Fixtures: a prime example of dependency injection
Fixtures allow test functions to easily receive and work against specific pre-initialized application objects without having to care about import/setup/cleanup details. It’s a prime example of dependency injection where fixture functions take the role of the injector and test functions are the consumers of fixture objects.
conftest.py: sharing fixture functions
If during implementing your tests you realize that you want to use a fixture function from multiple test files you can move it to a conftest.py file. You don’t need to import the fixture you want to use in a test, it automatically gets discovered by pytest. The discovery of fixture functions starts at test classes, then test modules, then conftest.py files and finally builtin and third party plugins.
You can also use the conftest.py file to implement local per-directory plugins.
Sharing test data
If you want to make test data from files available to your tests, a good way to do this is by loading these data in a fixture for use by your tests. This makes use of the automatic caching mechanisms of pytest.
Another good approach is by adding the data files in the tests folder. There are also community plugins available to help manage this aspect of testing, e.g. pytest-datadir and pytest-datafiles.
Scope: sharing a fixture instance across tests in a class, module or session
Fixtures requiring network access depend on connectivity and are usually time-expensive to create. Extending the previous example, we can add a scope="module" parameter to the @pytest.fixture invocation to cause the decorated smtp fixture function to only be invoked once per test module (the default is to invoke once per test function). Multiple test functions in a test module will thus each receive the same smtp fixture instance, saving time.
The next example puts the fixture function into a separate conftest.py file so that tests from multiple test modules in the directory can access the fixture function:
The name of the fixture again is smtp and you can access its result by listing the name smtp as an input parameter in any test or fixture function (in or below the directory where conftest.py is located):
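A sketch of the two files, reconstructed in the spirit of the original example:

```python
# content of conftest.py
import smtplib

import pytest

@pytest.fixture(scope="module")
def smtp():
    # Invoked once per test module instead of once per test function.
    return smtplib.SMTP("smtp.gmail.com", 587, timeout=5)

# content of test_module.py
def test_ehlo(smtp):
    response, msg = smtp.ehlo()
    assert response == 250
    assert 0  # deliberate failure, for demonstration purposes

def test_noop(smtp):
    response, msg = smtp.noop()
    assert response == 250
    assert 0  # deliberate failure, for demonstration purposes
```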
We deliberately insert failing assert 0 statements in order to inspect what is going on and can now run the tests.
You see the two assert 0 failures and, more importantly, you can also see that the same (module-scoped) smtp object was passed into the two test functions, because pytest shows the incoming argument values in the traceback. As a result, the two test functions using smtp run as quickly as a single one because they reuse the same instance.
If you decide that you rather want to have a session-scoped smtp instance, you can simply declare it with @pytest.fixture(scope="session").
Finally, the class scope will invoke the fixture once per test class.
Fixture finalization / executing teardown code
pytest supports execution of fixture-specific finalization code when the fixture goes out of scope. By using a yield statement instead of return, all the code after the yield statement serves as the teardown code:
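A sketch of the yield-based teardown, following the earlier smtp example:

```python
# content of conftest.py
import smtplib

import pytest

@pytest.fixture(scope="module")
def smtp():
    smtp = smtplib.SMTP("smtp.gmail.com", 587, timeout=5)
    yield smtp  # everything before the yield is setup
    print("teardown smtp")
    smtp.close()  # everything after the yield is teardown
```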
The print and smtp.close() statements will execute when the last test in the module has finished execution, regardless of the exception status of the tests.
Let’s execute it:
We see that the smtp instance is finalized after the two tests finished execution. Note that if we decorated our fixture function with scope='function' then fixture setup and cleanup would occur around each single test. In either case the test module itself does not need to change or know about these details of fixture setup.
Note that we can also seamlessly use the yield syntax with with statements:
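A sketch combining yield with a with statement:

```python
import smtplib

import pytest

@pytest.fixture(scope="module")
def smtp():
    with smtplib.SMTP("smtp.gmail.com", 587, timeout=5) as smtp:
        yield smtp  # the connection is closed when the with block exits
```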
The smtp connection will be closed after the test has finished execution, because the smtp object automatically closes when the with statement ends.
Note that if an exception happens during the setup code (before the yield keyword), the teardown code (after the yield) will not be called.
An alternative option for executing teardown code is to make use of the addfinalizer method of the request-context object to register finalization functions.
Here’s the smtp fixture changed to use addfinalizer for cleanup:
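A sketch of the addfinalizer variant:

```python
import smtplib

import pytest

@pytest.fixture(scope="module")
def smtp(request):
    smtp = smtplib.SMTP("smtp.gmail.com", 587, timeout=5)

    def fin():
        print("teardown smtp")
        smtp.close()

    request.addfinalizer(fin)  # fin() runs when the fixture goes out of scope
    return smtp
```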
Both the yield and addfinalizer methods work similarly by calling their code after the test ends, but addfinalizer has two key differences over yield:
- It is possible to register multiple finalizer functions.
- Finalizers will always be called regardless of whether the fixture setup code raises an exception. This is handy to properly close all resources created by a fixture even if one of them fails to be created/acquired:
    @pytest.fixture
    def equipments(request):
        r = []
        for port in ('C1', 'C3', 'C28'):
            equip = connect(port)
            request.addfinalizer(equip.disconnect)
            r.append(equip)
        return r

In the example above, if "C28" fails with an exception, "C1" and "C3" will still be properly closed. Of course, if an exception happens before the finalize function is registered then it will not be executed.
Fixtures can introspect the requesting test context
Fixture functions can accept the request object to introspect the “requesting” test function, class or module context. Further extending the previous smtp fixture example, let’s read an optional server URL from the test module which uses our fixture:
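A sketch of the fixture reading an optional module attribute:

```python
# content of conftest.py
import smtplib

import pytest

@pytest.fixture(scope="module")
def smtp(request):
    # Read an optional `smtpserver` attribute from the requesting test module,
    # falling back to a default if the module does not define one.
    server = getattr(request.module, "smtpserver", "smtp.gmail.com")
    smtp = smtplib.SMTP(server, 587, timeout=5)
    yield smtp
    print("finalizing %s (%s)" % (smtp, server))
    smtp.close()
```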
We use the request.module attribute to optionally obtain an smtpserver attribute from the test module. If we just execute again, nothing much has changed.
Let’s quickly create another test module that actually sets the server URL in its module namespace:
voila! The smtp fixture function picked up our mail server name from the module namespace.
Parametrizing a fixture
Fixture functions can be parametrized, in which case they will be called multiple times, each time executing the set of dependent tests, i.e. the tests that depend on this fixture. Test functions usually do not need to be aware of their re-running. Fixture parametrization helps to write exhaustive functional tests for components which themselves can be configured in multiple ways.
Extending the previous example, we can flag the fixture to create two smtp fixture instances, which will cause all tests using the fixture to run twice. The fixture function gets access to each parameter through the special request object:
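A sketch of the parametrized fixture:

```python
# content of conftest.py
import smtplib

import pytest

@pytest.fixture(scope="module", params=["smtp.gmail.com", "mail.python.org"])
def smtp(request):
    # request.param holds the current parameter; the fixture (and every test
    # using it) runs once per entry in `params`.
    smtp = smtplib.SMTP(request.param, 587, timeout=5)
    yield smtp
    print("finalizing %s" % smtp)
    smtp.close()
```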
The main change is the declaration of params with @pytest.fixture, a list of values for each of which the fixture function will execute; it can access the current value via request.param. No test function code needs to change. So let’s just do another run.
We see that our two test functions each ran twice, against the different smtp instances. Note also that with the mail.python.org connection the second test fails in test_ehlo, because a different server string is expected than what arrived.
pytest will build a string that is the test ID for each fixture value in a parametrized fixture, e.g. test_ehlo[smtp.gmail.com] and test_ehlo[mail.python.org] in the above examples. These IDs can be used with -k to select specific cases to run, and they will also identify the specific case when one is failing. Running pytest with --collect-only will show the generated IDs.
Numbers, strings, booleans and None will have their usual string representation used in the test ID. For other objects, pytest will make a string based on the argument name. It is possible to customise the string used in a test ID for a certain fixture value by using the ids keyword argument:
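A sketch of both forms of ids:

```python
import pytest

# A list of strings: one ID per parameter value.
@pytest.fixture(params=[0, 1], ids=["spam", "ham"])
def a(request):
    return request.param

def test_a(a):
    pass

# Alternatively, a function called with each fixture value; returning None
# falls back to pytest's auto-generated ID.
def idfn(fixture_value):
    if fixture_value == 0:
        return "eggs"
    return None

@pytest.fixture(params=[0, 1], ids=idfn)
def b(request):
    return request.param

def test_b(b):
    pass
```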
The above shows how ids can be either a list of strings to use or a function which will be called with the fixture value and then has to return a string to use. In the latter case, if the function returns None then pytest’s auto-generated ID will be used.
Running the above tests with --collect-only shows the resulting test IDs.
Modularity: using fixtures from a fixture function
You can not only use fixtures in test functions, but fixture functions can use other fixtures themselves. This contributes to a modular design of your fixtures and allows re-use of framework-specific fixtures across many projects. As a simple example, we can extend the previous example and instantiate an app object where we stick the already defined smtp resource into it:
Here we declare an app fixture which receives the previously defined smtp fixture and instantiates an App object with it. Let’s run it.
Due to the parametrization of smtp, the test will run twice with two different App instances and respective smtp servers. There is no need for the app fixture to be aware of the smtp parametrization, as pytest will fully analyse the fixture dependency graph.
Note that the app fixture has a scope of module and uses the module-scoped smtp fixture. The example would still work if smtp was cached on a session scope: it is fine for fixtures to use “broader” scoped fixtures but not the other way round: a session-scoped fixture could not use a module-scoped one in a meaningful way.
Automatic grouping of tests by fixture instances
pytest minimizes the number of active fixtures during test runs. If you have a parametrized fixture, then all the tests using it will first execute with one instance and then finalizers are called before the next fixture instance is created. Among other things, this eases testing of applications which create and use global state.
The following example uses two parametrized fixtures, one of which is scoped on a per-module basis, and all the functions perform print calls to show the setup/teardown flow:
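A sketch of the module, reconstructed in the spirit of the original example:

```python
import pytest

@pytest.fixture(scope="module", params=["mod1", "mod2"])
def modarg(request):
    param = request.param
    print("  SETUP modarg %s" % param)
    yield param
    print("  TEARDOWN modarg %s" % param)

@pytest.fixture(scope="function", params=[1, 2])
def otherarg(request):
    param = request.param
    print("  SETUP otherarg %s" % param)
    yield param
    print("  TEARDOWN otherarg %s" % param)

def test_0(otherarg):
    print("  RUN test0 with otherarg %s" % otherarg)

def test_1(modarg):
    print("  RUN test1 with modarg %s" % modarg)

def test_2(otherarg, modarg):
    print("  RUN test2 with otherarg %s and modarg %s" % (otherarg, modarg))
```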
Let’s run the tests in verbose mode (-v) and with -s, so we can look at the print output.
You can see that the parametrized module-scoped modarg resource caused an ordering of test execution that led to the fewest possible “active” resources. The finalizer for the mod1 parametrized resource was executed before the mod2 resource was set up.
In particular, notice that test_0 is completely independent and finishes first. Then test_1 is executed with mod1, then test_2 with mod1, then test_1 with mod2 and finally test_2 with mod2.
The otherarg parametrized resource (having function scope) was set up before and torn down after every test that used it.
Using fixtures from classes, modules or projects
Sometimes test functions do not directly need access to a fixture object. For example, tests may need to operate with an empty directory as the current working directory but otherwise do not care for the concrete directory. Here is how you can use the standard tempfile module and pytest fixtures to achieve it. We separate the creation of the fixture into a conftest.py file:
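A sketch of the fixture:

```python
# content of conftest.py
import os
import tempfile

import pytest

@pytest.fixture()
def cleandir():
    # Create a fresh temporary directory and make it the cwd for the test.
    newpath = tempfile.mkdtemp()
    os.chdir(newpath)
```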
and declare its use in a test module via a usefixtures marker:
Due to the usefixtures marker, the cleandir fixture will be required for the execution of each test method, just as if you specified a “cleandir” function argument to each of them. Let’s run it to verify our fixture is activated and the tests pass.
You can specify multiple fixtures like this: @pytest.mark.usefixtures("cleandir", "anotherfixture").
and you may specify fixture usage at the test module level, using a generic feature of the mark mechanism: pytestmark = pytest.mark.usefixtures("cleandir").
Note that the assigned variable must be called pytestmark; assigning e.g. foo = pytest.mark.usefixtures("cleandir") will not activate the fixtures.
Lastly you can put fixtures required by all tests in your project into an ini-file:
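For example, in pytest.ini (or the [pytest] section of tox.ini or setup.cfg):

```ini
[pytest]
usefixtures = cleandir
```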
Autouse fixtures (xUnit setup on steroids)
Occasionally, you may want to have fixtures get invoked automatically without declaring a function argument explicitly or a usefixtures decorator. As a practical example, suppose we have a database fixture which has a begin/rollback/commit architecture and we want to automatically surround each test method by a transaction and a rollback. Here is a dummy self-contained implementation of this idea:
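A sketch of the idea (the DB class is a dummy stand-in for a real database layer):

```python
import pytest

class DB:
    def __init__(self):
        self.intransaction = []

    def begin(self, name):
        self.intransaction.append(name)

    def rollback(self):
        self.intransaction.pop()

@pytest.fixture(scope="module")
def db():
    return DB()

class TestClass:
    @pytest.fixture(autouse=True)
    def transact(self, request, db):
        # Runs automatically around every test method in this class.
        db.begin(request.function.__name__)
        yield
        db.rollback()

    def test_method1(self, db):
        assert db.intransaction == ["test_method1"]

    def test_method2(self, db):
        assert db.intransaction == ["test_method2"]
```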
The class-level transact fixture is marked with autouse=True, which implies that all test methods in the class will use this fixture without a need to state it in the test function signature or with a class-level usefixtures decorator.
If we run it, we get two passing tests.
Here is how autouse fixtures work in other scopes:
- autouse fixtures obey the scope= keyword-argument: if an autouse fixture has scope="session" it will only be run once, no matter where it is defined. scope="class" means it will be run once per class, etc.
- if an autouse fixture is defined in a test module, all its test functions automatically use it.
- if an autouse fixture is defined in a conftest.py file then all tests in all test modules below its directory will invoke the fixture.
- lastly, and please use that with care: if you define an autouse fixture in a plugin, it will be invoked for all tests in all projects where the plugin is installed. This can be useful if a fixture only works in the presence of certain settings anyway, e.g. in the ini-file. Such a global fixture should always quickly determine if it should do any work and avoid otherwise expensive imports or computation.
Note that the above transact fixture may very well be a fixture that you want to make available in your project without having it generally active. The canonical way to do that is to put the transact definition into a conftest.py file without using autouse:
and then e.g. have a TestClass using it by declaring the need:
All test methods in this TestClass will use the transact fixture, while other test classes or functions in the module will not use it unless they also add a transact reference.
Overriding fixtures on various levels
In a relatively large test suite, you most likely need to override a global or root fixture with a locally defined one, keeping the test code readable and maintainable.
Override a fixture on a folder (conftest) level
Given the tests file structure is:
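A sketch of the layout (reconstructed; the username fixture is illustrative):

```
tests/
    conftest.py
        # content of tests/conftest.py
        import pytest

        @pytest.fixture
        def username():
            return 'username'
    test_something.py
        # content of tests/test_something.py
        def test_username(username):
            assert username == 'username'
    subfolder/
        conftest.py
            # content of tests/subfolder/conftest.py
            import pytest

            @pytest.fixture
            def username(username):
                return 'overridden-' + username
        test_something.py
            # content of tests/subfolder/test_something.py
            def test_username(username):
                assert username == 'overridden-username'
```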
As you can see, a fixture with the same name can be overridden for a certain test folder level. Note that the base or super fixture can be accessed from the overriding fixture easily; that is what is used in the example above.
Override a fixture on a test module level
Given the tests file structure is:
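A sketch of the layout (reconstructed; the username fixture is illustrative):

```
tests/
    conftest.py
        # content of tests/conftest.py
        import pytest

        @pytest.fixture
        def username():
            return 'username'
    test_something.py
        # content of tests/test_something.py
        import pytest

        @pytest.fixture
        def username(username):
            return 'overridden-' + username

        def test_username(username):
            assert username == 'overridden-username'
```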
In the example above, a fixture with the same name can be overridden for a certain test module.
Override a fixture with direct test parametrization
Given the tests file structure is:
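A sketch of the layout (reconstructed; the username fixtures are illustrative):

```
tests/
    conftest.py
        # content of tests/conftest.py
        import pytest

        @pytest.fixture
        def username():
            return 'username'

        @pytest.fixture
        def other_username(username):
            return 'other-' + username
    test_something.py
        # content of tests/test_something.py
        import pytest

        @pytest.mark.parametrize('username', ['directly-overridden-username'])
        def test_username(username):
            assert username == 'directly-overridden-username'

        @pytest.mark.parametrize('username', ['directly-overridden-username-other'])
        def test_username_other(other_username):
            # the parametrized value propagates even though the test only
            # uses other_username, which depends on username
            assert other_username == 'other-directly-overridden-username-other'
```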
In the example above, a fixture value is overridden by the test parameter value. Note that the value of the fixture can be overridden this way even if the test doesn’t use it directly (doesn’t mention it in the function prototype).
Override a parametrized fixture with non-parametrized one and vice versa
Given the tests file structure is:
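A sketch of the layout (reconstructed; the fixtures are illustrative):

```
tests/
    conftest.py
        # content of tests/conftest.py
        import pytest

        @pytest.fixture(params=['one', 'two', 'three'])
        def parametrized_username(request):
            return request.param

        @pytest.fixture
        def non_parametrized_username():
            return 'username'
    test_something.py
        # content of tests/test_something.py
        import pytest

        @pytest.fixture
        def parametrized_username():
            return 'overridden-username'

        @pytest.fixture(params=['one', 'two', 'three'])
        def non_parametrized_username(request):
            return 'overridden-' + request.param

        def test_username(parametrized_username):
            assert parametrized_username == 'overridden-username'

        def test_parametrized_username(non_parametrized_username):
            assert non_parametrized_username.startswith('overridden-')
```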
In the example above, a parametrized fixture is overridden with a non-parametrized version, and a non-parametrized fixture is overridden with a parametrized version for a certain test module. The same applies for the test folder level, obviously.