A simple pytest plugin to verify that the execution time of critical sections of code has not regressed between releases.
- Configurable profiling parameters (target time, unit, number of iterations)
- Support for all time measurement units
You can install "pytest-performance" via pip from PyPI:
$ pip install pytest-performance
- Default
def my_func(*args, **kwargs):
    return 123

def test_my_func(performance):
    # Check my_func runs within 1 second for 10000 iterations.
    result = performance(my_func)
    assert result == 123
- Set custom time amount
def my_func(*args, **kwargs):
    return 123

def test_my_func(performance):
    # Check my_func runs within 10 seconds for 10000 iterations.
    result = performance(my_func, target=10)
    assert result == 123
- Set custom time amount and unit (pint units supported)
def my_func(*args, **kwargs):
    return 123

def test_my_func(performance):
    # Check my_func runs within 10 nanoseconds for 10000 iterations.
    result = performance(my_func, target=10, unit='ns')
    assert result == 123
- Set custom time amount, unit and number of iterations
def my_func(*args, **kwargs):
    return 123

def test_my_func(performance):
    # Check my_func runs within 10 nanoseconds for 10 iterations.
    result = performance(my_func, target=10, unit='ns', iterations=10)
    assert result == 123
- The fixture can be disabled by passing '--performance-skip' to pytest
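For intuition, the kind of timing check the `performance` fixture performs can be sketched with the standard library alone. This is an illustrative approximation, not the plugin's actual implementation; in particular, the assumption here is that `target` bounds the average time per call, expressed in seconds.

```python
import time

def check_performance(func, *args, target=1.0, iterations=10_000, **kwargs):
    # Illustrative stand-in for the `performance` fixture (assumption:
    # `target` bounds the average time per call, in seconds).
    result = None
    start = time.perf_counter()
    for _ in range(iterations):
        result = func(*args, **kwargs)
    elapsed = time.perf_counter() - start
    average = elapsed / iterations
    assert average <= target, (
        f"average call took {average:.9f}s, exceeding target of {target}s"
    )
    # Like the fixture, return the function's result so the test can
    # also assert on correctness.
    return result

def my_func():
    return 123

assert check_performance(my_func) == 123
```

The key design point this mirrors is that the timing check and the functional assertion stay in the same test: the wrapper both enforces the time budget and hands back the return value.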
Contributions are very welcome. Tests can be run with tox; please ensure that coverage at least stays the same before you submit a pull request.
Distributed under the terms of the MIT license, "pytest-performance" is free and open source software.
If you encounter any problems, please file an issue along with a detailed description.