fixtures's Introduction

fixtures: Fixtures with cleanups for testing and convenience

fixtures defines a Python contract for reusable state / support logic, primarily for unit testing. Helper and adaption logic is included to make it easy to write your own fixtures using the fixtures contract. Glue code is provided that makes it easy and straightforward to use fixtures that meet the Fixtures contract in unittest compatible test cases.

Dependencies

  • Python 3.8+: the base language fixtures is written in and for.
  • pbr: used for version and release management of fixtures.

The fixtures[streams] extra adds:

  • testtools <https://launchpad.net/testtools>

    testtools provides helpful glue functions for the details API used to report information about a fixture (whether it's used in a testing or production environment).

For use in a unit test suite using the included glue, you will need a test environment that supports TestCase.addCleanup. Writing your own glue code is easy. Alternatively, you can simply use Fixtures directly without any support code.

To run the test suite for fixtures, testtools is needed.

Why Fixtures

Standard Python unittest provides no obvious method for making and reusing state needed in a test case other than by adding a method on the test class. This scales poorly - complex helper functions propagating up a test class hierarchy is a regular pattern when this is done. Mocking, while a great tool, doesn't itself prevent this (and helpers to mock complex things can accumulate in the same way if placed on the test class).

By defining a uniform contract where helpers have no dependency on the test class we permit all the regular code hygiene activities to take place without the distorting influence of being in a class hierarchy that is modelling an entirely different thing - which is what helpers on a TestCase suffer from.

About Fixtures

A fixture represents some state. Each fixture has attributes on it that are specific to the fixture. For instance, a fixture representing a directory that can be used for temporary files might have an attribute, path.

Most fixtures have complete pydoc documentation, so be sure to check pydoc fixtures for usage information.

Creating Fixtures

Minimally, subclass Fixture, define _setUp to initialize your state, schedule a cleanup for when cleanUp is called, and you're done:

>>> import unittest
>>> import fixtures
>>> class NoddyFixture(fixtures.Fixture):
...     def _setUp(self):
...         self.frobnozzle = 42
...         self.addCleanup(delattr, self, 'frobnozzle')

This will initialize frobnozzle when setUp is called, and remove the frobnozzle attribute when cleanUp is called. Prior to version 1.3.0, fixtures recommended overriding setUp. This is still supported, but because it is harder to write leak-free fixtures that way, it is no longer recommended.
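
For example, a brief sketch of the lifecycle using the fixture defined above:

>>> fixture = NoddyFixture()
>>> fixture.setUp()
>>> print (fixture.frobnozzle)
42
>>> fixture.cleanUp()
>>> print (hasattr(fixture, 'frobnozzle'))
False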

If your fixture has diagnostic data - for instance the log file of an application server, or log messages - it can expose that by creating a content object (testtools.content.Content) and calling addDetail:

>>> from testtools.content import text_content
>>> class WithLog(fixtures.Fixture):
...     def _setUp(self):
...         self.addDetail('message', text_content('foo bar baz'))

The method useFixture will use another fixture, call setUp on it, call self.addCleanup(thefixture.cleanUp), attach any details from it and return the fixture. This allows simple composition of different fixtures:

>>> class ReusingFixture(fixtures.Fixture):
...     def _setUp(self):
...         self.noddy = self.useFixture(NoddyFixture())

There is a helper for adapting a function or function pair into Fixtures. It puts the result of the function in fn_result:

>>> import os.path
>>> import shutil
>>> import tempfile
>>> def setup_function():
...     return tempfile.mkdtemp()
>>> def teardown_function(fixture):
...     shutil.rmtree(fixture)
>>> fixture = fixtures.FunctionFixture(setup_function, teardown_function)
>>> fixture.setUp()
>>> print (os.path.isdir(fixture.fn_result))
True
>>> fixture.cleanUp()

This can be expressed even more pithily:

>>> fixture = fixtures.FunctionFixture(tempfile.mkdtemp, shutil.rmtree)
>>> fixture.setUp()
>>> print (os.path.isdir(fixture.fn_result))
True
>>> fixture.cleanUp()

Another variation is MethodFixture which is useful for adapting alternate fixture implementations to Fixture:

>>> class MyServer:
...    def start(self):
...        pass
...    def stop(self):
...        pass
>>> server = MyServer()
>>> fixture = fixtures.MethodFixture(server, server.start, server.stop)

You can also combine existing fixtures using CompoundFixture:

>>> noddy_with_log = fixtures.CompoundFixture([NoddyFixture(),
...                                            WithLog()])
>>> with noddy_with_log as x:
...     print (x.fixtures[0].frobnozzle)
42

The Fixture API

The example above introduces some of the Fixture API. In order to be able to clean up after a fixture has been used, all fixtures define a cleanUp method which should be called when a fixture is finished with.

Because it's nice to be able to build a particular set of related fixtures in advance of using them, fixtures also have a setUp method which should be called before trying to use them.

One common desire with fixtures that are expensive to create is to reuse them in many test cases; to support this the base Fixture also defines a reset method, which by default calls self.cleanUp() followed by self.setUp(). Fixtures that can make themselves reusable more efficiently should override this method. Such fixtures can then be shared across multiple tests via things like testresources, setUpClass, or setUpModule.
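
For example (a sketch; NoddyFixture relies on the default reset behaviour of cleanUp followed by setUp):

>>> fixture = NoddyFixture()
>>> fixture.setUp()
>>> fixture.reset()
>>> print (fixture.frobnozzle)
42
>>> fixture.cleanUp()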

When using a fixture with a test you can manually call the setUp and cleanUp methods. More convenient though is to use the included glue from fixtures.TestWithFixtures, a mixin that defines a useFixture method (camel case because unittest is camel case throughout). It will call setUp on the fixture, schedule the fixture's cleanUp via self.addCleanup, and return the fixture. This lets one write:

>>> import testtools
>>> import unittest

Note that we use testtools.TestCase. testtools has its own implementation of useFixture, so there is no need to use fixtures.TestWithFixtures with testtools.TestCase:

>>> class NoddyTest(testtools.TestCase, fixtures.TestWithFixtures):
...     def test_example(self):
...         fixture = self.useFixture(NoddyFixture())
...         self.assertEqual(42, fixture.frobnozzle)
>>> result = unittest.TestResult()
>>> _ = NoddyTest('test_example').run(result)
>>> print (result.wasSuccessful())
True
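
With the standard library's unittest the same pattern looks like this (a sketch; fixtures.TestWithFixtures already derives from unittest.TestCase, so it can be used as the base class directly):

>>> class PlainNoddyTest(fixtures.TestWithFixtures):
...     def test_example(self):
...         fixture = self.useFixture(NoddyFixture())
...         self.assertEqual(42, fixture.frobnozzle)
>>> result = unittest.TestResult()
>>> _ = PlainNoddyTest('test_example').run(result)
>>> print (result.wasSuccessful())
True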

Fixtures implement the context protocol, so you can also use a fixture as a context manager:

>>> with fixtures.FunctionFixture(setup_function, teardown_function) as fixture:
...    print (os.path.isdir(fixture.fn_result))
True

When multiple cleanups error, fixture.cleanUp() will raise a wrapper exception rather than choosing an arbitrary single exception to raise:

>>> import sys
>>> from fixtures.fixture import MultipleExceptions
>>> class BrokenFixture(fixtures.Fixture):
...     def _setUp(self):
...         self.addCleanup(lambda:1/0)
...         self.addCleanup(lambda:1/0)
>>> fixture = BrokenFixture()
>>> fixture.setUp()
>>> try:
...    fixture.cleanUp()
... except MultipleExceptions:
...    exc_info = sys.exc_info()
>>> print (exc_info[1].args[0][0].__name__)
ZeroDivisionError

Fixtures often expose diagnostic details that can be useful for tracking down issues. The getDetails method will return a dict of all the attached details but can only be called before cleanUp is called. Each detail object is an instance of testtools.content.Content:

>>> with WithLog() as l:
...     print(l.getDetails()['message'].as_text())
foo bar baz

Errors in setUp

The examples above used _setUp rather than setUp because the base class implementation of setUp acts to reduce the chance of leaking external resources if an error is raised from _setUp. Specifically, setUp contains a try/except block which catches all exceptions, captures any registered detail objects, and calls self.cleanUp before propagating the error. As long as you take care to register any cleanups before calling the code that may fail, this will cause them to be cleaned up. The captured detail objects are provided to the args of the raised exception.

If the error that occurred was a subclass of Exception then setUp will raise MultipleExceptions with the last element being a SetupError that contains the detail objects. Otherwise, to prevent causing normally uncatchable errors like KeyboardInterrupt being caught inappropriately in the calling layer, the original exception will be raised as-is and no diagnostic data other than that from the original exception will be available.
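
A sketch of what this looks like in practice, reusing MultipleExceptions from above and assuming SetupError is importable from fixtures.fixture alongside it (BrokenSetupFixture is purely illustrative):

>>> from fixtures.fixture import SetupError
>>> class BrokenSetupFixture(fixtures.Fixture):
...     def _setUp(self):
...         self.addCleanup(lambda: None)  # registered before the failure, so it still runs
...         self.addDetail('note', text_content('partial state'))
...         raise Exception('_setUp failed')
>>> fixture = BrokenSetupFixture()
>>> try:
...     fixture.setUp()
... except MultipleExceptions:
...     exc_info = sys.exc_info()
>>> print (exc_info[1].args[0][-1][0] is SetupError)
True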

Shared Dependencies

A common use case within complex environments is having some fixtures shared by other ones.

Consider the case of testing using a TempDir with two fixtures built on top of it; say a small database and a web server. Writing either one is nearly trivial. However, handling reset() correctly is hard: both the database and web server would reasonably expect to be able to discard operating system resources they may have open within the temporary directory before it is removed. A recursive reset() implementation would work for one, but not both. Calling reset() on the TempDir instance between each test is probably desirable, but we don't want to have to do a complete cleanUp of the higher layer fixtures (which would make the TempDir unused and trivially resettable). We have a few options available to us.

Imagine that the webserver does not depend on the DB fixture in any way - we just want the webserver and DB fixture to coexist in the same tempdir.

A simple option is to just provide an explicit dependency fixture for the higher layer fixtures to use. This pushes complexity out of the core and onto users of fixtures:

>>> class WithDep(fixtures.Fixture):
...     def __init__(self, tempdir, dependency_fixture):
...         super(WithDep, self).__init__()
...         self.tempdir = tempdir
...         self.dependency_fixture = dependency_fixture
...     def setUp(self):
...         super(WithDep, self).setUp()
...         self.addCleanup(self.dependency_fixture.cleanUp)
...         self.dependency_fixture.setUp()
...         # we assume that at this point self.tempdir is usable.
>>> DB = WithDep
>>> WebServer = WithDep
>>> tempdir = fixtures.TempDir()
>>> db = DB(tempdir, tempdir)
>>> server = WebServer(tempdir, db)
>>> server.setUp()
>>> server.cleanUp()

Another option is to write the fixtures to gracefully handle a dependency being reset underneath them. This is insufficient if the fixtures would block the dependency resetting (for instance by holding file locks open in a tempdir - on Windows this will prevent the directory being deleted).

Another approach which fixtures neither helps nor hinders is to raise a signal of some sort for each user of a fixture before it is reset. In the example here, TempDir might offer a subscribers attribute that both the DB and web server would be registered in. Calling reset or cleanUp on the tempdir would trigger a callback to all the subscribers; the DB and web server reset methods would look something like:

>>> def reset(self):
...     if not self._cleaned:
...         self._clean()

(Their action on the callback from the tempdir would be to do whatever work was needed and set self._cleaned.) This approach has the (perhaps) surprising effect that resetting the webserver may reset the DB - if the webserver were to be depending on tempdir.reset as a way to reset the webserver's state.

Another approach which is not currently implemented is to provide an object graph of dependencies and a reset mechanism that can traverse that, along with a separation between 'reset starting' and 'reset finishing' - the DB and webserver would both have their reset_starting methods called, then the tempdir would be reset, and finally the DB and webserver would have reset_finishing called.

Stock Fixtures

In addition to the Fixture, FunctionFixture and MethodFixture classes, fixtures includes a number of pre-canned fixtures. The API docs for fixtures will list the complete set of these, should the docs be out of date or not to hand. For the complete feature set of each fixture please see the API docs.

ByteStream

Trivial adapter to make a BytesIO (though it may in future auto-spill to disk for large content) and expose that as a detail object, for automatic inclusion in test failure descriptions. Very useful in combination with MonkeyPatch:

>>> fixture = fixtures.ByteStream('my-content')
>>> fixture.setUp()
>>> with fixtures.MonkeyPatch('sys.something', fixture.stream):
...     pass
>>> fixture.cleanUp()

This requires the fixtures[streams] extra.

EnvironmentVariable

Isolate your code from environment variables; delete them or set them to a new value:

>>> fixture = fixtures.EnvironmentVariable('HOME')
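
A new value can also be supplied; it is set for the duration of the fixture and restored afterwards (a sketch; FIXTURES_DEMO is an arbitrary variable name assumed to be otherwise unset):

>>> import os
>>> with fixtures.EnvironmentVariable('FIXTURES_DEMO', 'some-value'):
...     print (os.environ['FIXTURES_DEMO'])
some-value
>>> print ('FIXTURES_DEMO' in os.environ)
False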

FakeLogger

Isolate your code from an external logging configuration - so that your test gets the output from logged messages, but they don't go to e.g. the console:

>>> fixture = fixtures.FakeLogger()
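
For example, messages logged while the fixture is active can be inspected afterwards (a sketch assuming the default INFO level on the root logger and that the captured text is exposed as the output attribute):

>>> import logging
>>> with fixtures.FakeLogger() as logger_fixture:
...     logging.info('hello from the test')
>>> print ('hello from the test' in logger_fixture.output)
True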

FakePopen

Pretend to run an external command rather than needing it to be present to run tests:

>>> from io import BytesIO
>>> fixture = fixtures.FakePopen(lambda _:{'stdout': BytesIO(b'foobar')})
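
A sketch of how this might be used, assuming FakePopen substitutes itself for subprocess.Popen while set up and that the dict returned by the callback seeds attributes of the faked process:

>>> import subprocess
>>> with fixtures.FakePopen(lambda proc_args: {'stdout': BytesIO(b'faked output')}):
...     proc = subprocess.Popen(['some-command'], stdout=subprocess.PIPE)
...     print (proc.stdout.read())
b'faked output'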

LogHandler

Replace or extend a logger's handlers. The behavior of this fixture depends on the value of the nuke_handlers parameter: if true, the logger's existing handlers are removed and replaced by the provided handler, while if false the logger's set of handlers is extended by the provided handler:

>>> from logging import StreamHandler
>>> fixture = fixtures.LogHandler(StreamHandler())
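
To extend the logger's existing handlers instead of replacing them, pass nuke_handlers=False (a sketch based on the parameter described above):

>>> fixture = fixtures.LogHandler(StreamHandler(), nuke_handlers=False)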

MockPatchObject

Adapts mock.patch.object to be used as a fixture:

>>> class Fred:
...     value = 1
>>> fixture = fixtures.MockPatchObject(Fred, 'value', 2)
>>> with fixture:
...     Fred().value
2
>>> Fred().value
1

MockPatch

Adapts mock.patch to be used as a fixture:

>>> fixture = fixtures.MockPatch('subprocess.Popen.returncode', 3)
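
The patch is active between setUp and cleanUp, so the context-manager form reads (a sketch; the patched target and replacement value are arbitrary):

>>> import os
>>> with fixtures.MockPatch('os.getcwd', lambda: '/pretend/dir'):
...     print (os.getcwd())
/pretend/dir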

MockPatchMultiple

Adapts mock.patch.multiple to be used as a fixture:

>>> fixture = fixtures.MockPatchMultiple('subprocess.Popen', returncode=3)

MonkeyPatch

Control the value of a named Python attribute

>>> def fake_open(path, mode):
...     pass
>>> fixture = fixtures.MonkeyPatch('builtins.open', fake_open)

Note that there are some complexities when patching methods - please see the API documentation for details.

NestedTempfile

Change the default directory in which the tempfile module places temporary files and directories. This can be useful for containing the noise created by code that doesn't clean up its temporary files. It does not affect temporary file creation where an explicit containing directory was provided:

>>> fixture = fixtures.NestedTempfile()
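
For example (a sketch): temporary files and directories created while the fixture is active live inside a private directory that is removed on cleanUp:

>>> import os.path
>>> import tempfile
>>> with fixtures.NestedTempfile():
...     path = tempfile.mkdtemp()
>>> print (os.path.isdir(path))
False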

PackagePathEntry

Adds a single directory to the path for an existing Python package. This adds to the package.__path__ list. If the directory is already in the path, nothing happens; if it isn't, it is added on setUp and removed on cleanUp:

>>> fixture = fixtures.PackagePathEntry('package/name', '/foo/bar')

PythonPackage

Creates a python package directory. Particularly useful for testing code that dynamically loads packages/modules, or for mocking out the command line entry points to Python programs:

>>> fixture = fixtures.PythonPackage('foo.bar', [('quux.py', '')])

PythonPathEntry

Adds a single directory to sys.path. If the directory is already in the path, nothing happens; if it isn't, it is added on setUp and removed on cleanUp:

>>> fixture = fixtures.PythonPathEntry('/foo/bar')

Stream

Trivial adapter to expose a file-like object as a detail object, for automatic inclusion in test failure descriptions. StringStream and ByteStream provide concrete users of this fixture.

This requires the fixtures[streams] extra.

StringStream

Trivial adapter to make a StringIO (though it may in future auto-spill to disk for large content) and expose that as a detail object, for automatic inclusion in test failure descriptions. Very useful in combination with MonkeyPatch:

>>> fixture = fixtures.StringStream('stdout')
>>> fixture.setUp()
>>> with fixtures.MonkeyPatch('sys.stdout', fixture.stream):
...     pass
>>> fixture.cleanUp()

This requires the fixtures[streams] extra.

TempDir

Create a temporary directory and clean it up later:

>>> fixture = fixtures.TempDir()

The created directory is stored in the path attribute of the fixture after setUp.
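
For example, a short sketch using the context-manager form:

>>> import os.path
>>> with fixtures.TempDir() as temp_dir:
...     print (os.path.isdir(temp_dir.path))
True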

TempHomeDir

Create a temporary directory and set it as $HOME in the environment:

>>> fixture = fixtures.TempHomeDir()

The created directory is stored in the path attribute of the fixture after setUp.

The environment will now have $HOME set to the same path, and the value will be returned to its previous value after cleanUp.

Timeout

Aborts if the covered code takes more than a specified number of whole wall-clock seconds.

There are two possibilities, controlled by the gentle argument: when gentle, an exception will be raised and the test (or other covered code) will fail. When not gentle, the entire process will be terminated, which is less clean, but more likely to break hangs where no Python code is running.
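
For example, to raise an exception in the covered code if it runs for more than 100 wall-clock seconds (a sketch; the limit is arbitrary):

>>> fixture = fixtures.Timeout(100, gentle=True)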

Caution!

Only one timeout can be active at any time across all threads in a single process. Using more than one has undefined results. (This could be improved by chaining alarms.)

Note

Currently supported only on Unix because it relies on the alarm system call.

WarningsCapture

Capture warnings for later analysis:

>>> fixture = fixtures.WarningsCapture()

The captured warnings are stored in the captures attribute of the fixture after setUp.
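
For example (a sketch; each captured item is assumed to expose the usual warnings attributes, such as category):

>>> import warnings
>>> with fixtures.WarningsCapture() as capture:
...     warnings.warn('example warning')
>>> print (capture.captures[0].category.__name__)
UserWarning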

WarningsFilter

Configure warnings filters during test runs:

>>> fixture = fixtures.WarningsFilter(
...     [
...         {
...             'action': 'ignore',
...             'message': 'foo',
...             'category': DeprecationWarning,
...         },
...     ]
... )

Order is important: entries closer to the front of the list override entries later in the list, if both match a particular warning.

Contributing

Fixtures has its project homepage on GitHub <https://github.com/testing-cabal/fixtures>.

License

Copyright (c) 2010, Robert Collins <[email protected]>

Licensed under either the Apache License, Version 2.0 or the BSD 3-clause license at the user's choice. Copies of both licenses are available in the project source as Apache-2.0 and BSD. You may not use this file except in compliance with one of these two licences.

Unless required by applicable law or agreed to in writing, software distributed under these licenses is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the license you chose for the specific language governing permissions and limitations under that license.

fixtures's People

Contributors

alaski, allenap, cboylan, cjwatson, dankenigsberg, edwardbetts, frankban, freeekanayaka, hugovk, javacruft, jd, jelmer, jml, jugmac00, mgorny, mindw, pkulev, rbtcollins, sdague, stephenfin

fixtures's Issues

Automate releases

When we merged #51, I included the ability to automate creation of releases via the pypa/gh-action-pypi-publish action.

release:
  name: Upload release artifacts
  runs-on: ubuntu-latest
  needs: test
  if: github.event_name == 'push'
  steps:
    - name: Checkout source code
      uses: actions/checkout@v2
      with:
        fetch-depth: 0
    - name: Set up Python 3.10
      uses: actions/setup-python@v2
      with:
        python-version: "3.10"
    - name: Install dependencies
      run: python -m pip install build
    - name: Build a binary wheel and a source tarball
      run: python -m build --sdist --wheel --outdir dist/ .
    - name: Publish distribution to PyPI
      if: startsWith(github.ref, 'refs/tags')
      uses: pypa/gh-action-pypi-publish@master
      with:
        password: ${{ secrets.PYPI_API_TOKEN }}

I'm using this for a couple of other projects like git-pw and sphinx-click; however, for that to work here, someone needs to create the relevant secret. I don't have access to secrets (I think I need "Admin" privileges for that) nor am I an admin of the PyPI project, so I can't do this. @jelmer @cjwatson @rbtcollins could one of you tackle this? The steps are simply:

  1. Create a new PyPI token here. The scope should be Project: fixtures.
  2. Create a new secret here. It should be called PYPI_API_TOKEN and the value should be the token generated in step 1.

Alternatively, I'm happy to handle this but I'll need the additional superpowers outlined above.

Once this is done, I'll kick on with a 4.0.0 release, which is long overdue.

Test regression on PyPy3.9

Fixtures 4.0.0 fails tests on PyPy3.9 7.3.9:

pypy3 run-test: commands[0] | make check
python -m testtools.run fixtures.test_suite
/usr/lib/pypy3.9/runpy.py:127: RuntimeWarning: 'testtools.run' found in sys.modules after import of package 'testtools', but prior to execution of 'testtools.run'; this may result in unpredictable behaviour
  warn(RuntimeWarning(msg))
Tests running...
======================================================================
ERROR: fixtures.tests._fixtures.test_monkeypatch.TestMonkeyPatch.test_patch_classmethod_with_classmethod
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/tmp/fixtures/fixtures/tests/_fixtures/test_monkeypatch.py", line 200, in test_patch_classmethod_with_classmethod
    cls, = C.foo_cls()
ValueError: too many values to unpack (expected 1)
======================================================================
FAIL: fixtures.tests._fixtures.test_monkeypatch.TestMonkeyPatch.test_patch_classmethod_with_boundmethod
----------------------------------------------------------------------
Failed expectation: {{{MismatchError: <class 'fixtures.tests._fixtures.test_monkeypatch.C'> is not None}}}
Failed expectation-1: {{{MismatchError: <class 'fixtures.tests._fixtures.test_monkeypatch.C'> is not None}}}

AssertionError: Forced Test Failure

Ran 134 tests in 2.050s
FAILED (failures=2)

This seems to be caused by the following commit:

commit fe830674abd4926d96d38f9992f3e31b00cd891a (HEAD, refs/bisect/bad)
Author: Stephen Finucane <[email protected]>
Date:   2021-02-25 12:37:42 +0100

    Fix tests on Python 3.9
    
    I'm not entirely sure this is correct, but it's the only thing I can
    find related to changes in classmethod in Python 3.9.
    
    Signed-off-by: Stephen Finucane <[email protected]>

python 3.9 test failures

python -m testtools.run fixtures.test_suite
/opt/python/3.9-dev/lib/python3.9/runpy.py:127: RuntimeWarning: 'testtools.run' found in sys.modules after import of package 'testtools', but prior to execution of 'testtools.run'; this may result in unpredictable behaviour
  warn(RuntimeWarning(msg))
Tests running...
======================================================================
ERROR: fixtures.tests._fixtures.test_monkeypatch.TestMonkeyPatch.test_patch_classmethod_with_boundmethod
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/travis/build/graingert/fixtures/fixtures/tests/_fixtures/test_monkeypatch.py", line 223, in test_patch_classmethod_with_boundmethod
    slf, cls = C.foo_cls()
TypeError: bar_two_args() missing 1 required positional argument: 'arg'
======================================================================
ERROR: fixtures.tests._fixtures.test_monkeypatch.TestMonkeyPatch.test_patch_classmethod_with_classmethod
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/travis/build/graingert/fixtures/fixtures/tests/_fixtures/test_monkeypatch.py", line 191, in test_patch_classmethod_with_classmethod
    cls, target_class = C.foo_cls()
ValueError: not enough values to unpack (expected 2, got 1)


`FakePopen` is not fully compatible with Python 3.7

Python 3.7 introduced the text parameter for Popen

FakePopen has not yet been adjusted for this:

def __call__(self, args, bufsize=_unpassed, executable=_unpassed,
             stdin=_unpassed, stdout=_unpassed, stderr=_unpassed,
             preexec_fn=_unpassed, close_fds=_unpassed, shell=_unpassed,
             cwd=_unpassed, env=_unpassed, universal_newlines=_unpassed,
             startupinfo=_unpassed, creationflags=_unpassed):

FakeProcess doesn't respect universal_newlines

FakeProcess always returns stdout.getvalue() and stderr.getvalue() as they are. This is a problem when you're testing code that sometimes passes universal_newlines to the subprocess calls.

In Python 2.7, universal_newlines doesn't make much of a difference, but in Python 3, if you pass universal_newlines=True, you will get strings instead of bytes for stdout and stderr in the subprocess calls.

3.0.0 test failures

I see the following test failures on NetBSD-7.99.59/amd64 with python-3.6.0, testtools-2.2.0, mock-2.0.0, extras-1.0.0:

======================================================================
FAIL: test_capture_category (fixtures.tests._fixtures.test_warnings.TestWarnings)
fixtures.tests._fixtures.test_warnings.TestWarnings.test_capture_category
----------------------------------------------------------------------
testtools.testresult.real._StringException: Traceback (most recent call last):
  File "/scratch/devel/py-fixtures/work/fixtures-3.0.0/fixtures/tests/_fixtures/test_warnings.py", line 46, in test_capture_category
    self.assertEqual(len(categories), len(w.captures))
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 411, in assertEqual
    self.assertThat(observed, matcher, message)
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 498, in assertThat
    raise mismatch_error
testtools.matchers._impl.MismatchError: 7 != 0


======================================================================
FAIL: test_capture_message (fixtures.tests._fixtures.test_warnings.TestWarnings)
fixtures.tests._fixtures.test_warnings.TestWarnings.test_capture_message
----------------------------------------------------------------------
testtools.testresult.real._StringException: Traceback (most recent call last):
  File "/scratch/devel/py-fixtures/work/fixtures-3.0.0/fixtures/tests/_fixtures/test_warnings.py", line 34, in test_capture_message
    self.assertEqual(1, len(w.captures))
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 411, in assertEqual
    self.assertThat(observed, matcher, message)
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 498, in assertThat
    raise mismatch_error
testtools.matchers._impl.MismatchError: 1 != 0


======================================================================
FAIL: test_capture_reuse (fixtures.tests._fixtures.test_warnings.TestWarnings)
fixtures.tests._fixtures.test_warnings.TestWarnings.test_capture_reuse
----------------------------------------------------------------------
testtools.testresult.real._StringException: Traceback (most recent call last):
  File "/scratch/devel/py-fixtures/work/fixtures-3.0.0/fixtures/tests/_fixtures/test_warnings.py", line 27, in test_capture_reuse
    self.assertEqual(1, len(w.captures))
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 411, in assertEqual
    self.assertThat(observed, matcher, message)
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 498, in assertThat
    raise mismatch_error
testtools.matchers._impl.MismatchError: 1 != 0


======================================================================
FAIL: test_capture_category (fixtures.tests._fixtures.test_warnings.TestWarnings)
fixtures.tests._fixtures.test_warnings.TestWarnings.test_capture_category
----------------------------------------------------------------------
testtools.testresult.real._StringException: Traceback (most recent call last):
  File "/scratch/devel/py-fixtures/work/fixtures-3.0.0/fixtures/tests/_fixtures/test_warnings.py", line 46, in test_capture_category
    self.assertEqual(len(categories), len(w.captures))
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 411, in assertEqual
    self.assertThat(observed, matcher, message)
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 498, in assertThat
    raise mismatch_error
testtools.matchers._impl.MismatchError: 7 != 0


======================================================================
FAIL: test_capture_message (fixtures.tests._fixtures.test_warnings.TestWarnings)
fixtures.tests._fixtures.test_warnings.TestWarnings.test_capture_message
----------------------------------------------------------------------
testtools.testresult.real._StringException: Traceback (most recent call last):
  File "/scratch/devel/py-fixtures/work/fixtures-3.0.0/fixtures/tests/_fixtures/test_warnings.py", line 34, in test_capture_message
    self.assertEqual(1, len(w.captures))
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 411, in assertEqual
    self.assertThat(observed, matcher, message)
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 498, in assertThat
    raise mismatch_error
testtools.matchers._impl.MismatchError: 1 != 0


======================================================================
FAIL: test_capture_reuse (fixtures.tests._fixtures.test_warnings.TestWarnings)
fixtures.tests._fixtures.test_warnings.TestWarnings.test_capture_reuse
----------------------------------------------------------------------
testtools.testresult.real._StringException: Traceback (most recent call last):
  File "/scratch/devel/py-fixtures/work/fixtures-3.0.0/fixtures/tests/_fixtures/test_warnings.py", line 27, in test_capture_reuse
    self.assertEqual(1, len(w.captures))
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 411, in assertEqual
    self.assertThat(observed, matcher, message)
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 498, in assertThat
    raise mismatch_error
testtools.matchers._impl.MismatchError: 1 != 0


======================================================================
FAIL: test_capture_category (fixtures.tests._fixtures.test_warnings.TestWarnings)
fixtures.tests._fixtures.test_warnings.TestWarnings.test_capture_category
----------------------------------------------------------------------
testtools.testresult.real._StringException: Traceback (most recent call last):
  File "/scratch/devel/py-fixtures/work/fixtures-3.0.0/fixtures/tests/_fixtures/test_warnings.py", line 46, in test_capture_category
    self.assertEqual(len(categories), len(w.captures))
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 411, in assertEqual
    self.assertThat(observed, matcher, message)
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 498, in assertThat
    raise mismatch_error
testtools.matchers._impl.MismatchError: 7 != 0


======================================================================
FAIL: test_capture_message (fixtures.tests._fixtures.test_warnings.TestWarnings)
fixtures.tests._fixtures.test_warnings.TestWarnings.test_capture_message
----------------------------------------------------------------------
testtools.testresult.real._StringException: Traceback (most recent call last):
  File "/scratch/devel/py-fixtures/work/fixtures-3.0.0/fixtures/tests/_fixtures/test_warnings.py", line 34, in test_capture_message
    self.assertEqual(1, len(w.captures))
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 411, in assertEqual
    self.assertThat(observed, matcher, message)
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 498, in assertThat
    raise mismatch_error
testtools.matchers._impl.MismatchError: 1 != 0


======================================================================
FAIL: test_capture_reuse (fixtures.tests._fixtures.test_warnings.TestWarnings)
fixtures.tests._fixtures.test_warnings.TestWarnings.test_capture_reuse
----------------------------------------------------------------------
testtools.testresult.real._StringException: Traceback (most recent call last):
  File "/scratch/devel/py-fixtures/work/fixtures-3.0.0/fixtures/tests/_fixtures/test_warnings.py", line 27, in test_capture_reuse
    self.assertEqual(1, len(w.captures))
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 411, in assertEqual
    self.assertThat(observed, matcher, message)
  File "/usr/pkg/lib/python3.6/site-packages/testtools/testcase.py", line 498, in assertThat
    raise mismatch_error
testtools.matchers._impl.MismatchError: 1 != 0


----------------------------------------------------------------------
Ran 894 tests in 16.352s

FAILED (failures=9)
Test failed: <unittest.runner.TextTestResult run=894 errors=0 failures=9>
error: Test failed: <unittest.runner.TextTestResult run=894 errors=0 failures=9>

From the output I can't really recognize what the errors are.

Multiple test failures with Python 3.11.0b1

The following tests are failing against Python 3.11.0b1:

$ tox -e py311
/tmp/fixtures/.tox/py311/lib/python3.11/site-packages/setuptools/command/easy_install.py:144: EasyInstallDeprecationWarning: easy_install command is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
/tmp/fixtures/.tox/py311/lib/python3.11/site-packages/setuptools/command/install.py:34: SetuptoolsDeprecationWarning: setup.py install is deprecated. Use build and pip and other standards-based tools.
  warnings.warn(
py311 develop-inst-noop: /tmp/fixtures
py311 installed: docutils==0.18.1,extras==1.0.0,-e git+https://github.com/testing-cabal/fixtures/@4a30fe9c8f81ab6d1133055b3c1830e4d86c8518#egg=fixtures,mock==4.0.3,pbr==5.9.0,testtools==2.5.0
py311 run-test-pre: PYTHONHASHSEED='1983342538'
py311 run-test: commands[0] | python -m testtools.run fixtures.test_suite
<frozen runpy>:128: RuntimeWarning: 'testtools.run' found in sys.modules after import of package 'testtools', but prior to execution of 'testtools.run'; this may result in unpredictable behaviour
Tests running...
======================================================================
ERROR: fixtures.tests._fixtures.test_monkeypatch.TestMonkeyPatch.test_patch_classmethod_with_classmethod
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/tmp/fixtures/fixtures/tests/_fixtures/test_monkeypatch.py", line 203, in test_patch_classmethod_with_classmethod
    cls, = C.foo_cls()
    ^^^^
ValueError: too many values to unpack (expected 1)
======================================================================
FAIL: fixtures.tests._fixtures.test_monkeypatch.TestMonkeyPatch.test_patch_classmethod_with_boundmethod
----------------------------------------------------------------------
Failed expectation: {{{MismatchError: <class 'fixtures.tests._fixtures.test_monkeypatch.C'> is not None}}}
Failed expectation-1: {{{MismatchError: <class 'fixtures.tests._fixtures.test_monkeypatch.C'> is not None}}}

AssertionError: Forced Test Failure
======================================================================
FAIL: fixtures.tests._fixtures.test_popen.TestFakePopen.test_function_signature
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/tmp/fixtures/fixtures/tests/_fixtures/test_popen.py", line 145, in test_function_signature
    self.assertSetEqual(
    ^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/unittest/case.py", line 1101, in assertSetEqual
    self.fail(self._formatMessage(msg, standardMsg))
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/lib/python3.11/unittest/case.py", line 671, in fail
    raise self.failureException(msg)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AssertionError: Items in the second set but not the first:
'process_group' : Function signature of FakePopen doesn't match subprocess.Popen

Ran 139 tests in 2.032s
FAILED (failures=3)
ERROR: InvocationError for command /tmp/fixtures/.tox/py311/bin/python -m testtools.run fixtures.test_suite (exited with code 1)
_______________________________________________________________ summary _______________________________________________________________
ERROR:   py311: commands failed

Cyclical dependency on testtools

This package does not need to have testtools in its requirements.txt.

testtools has fixtures in its requirements.txt and this creates unnecessary confusion.

Create testing-cabal organization on PyPI

(Filing this here in absence of a meta repo for all things testing-cabal-related)

Should we set up a new organization on PyPI for testing-cabal related projects (i.e. everything in this GitHub organization)? The main advantage I see is increased bus factor when it comes to releasing projects like testrepository.

A request can be filed at https://pypi.org/manage/organizations/. I can happily file the request myself but I would likely need to be an organization admin for it to have any sway. Given I'm already included in fixtures and use this stuff full-time in my OpenStack work, that would be a-okay by me but it's not my decision, naturally.

CC: @jelmer

3.0.0: pytest warnings

+ PYTHONPATH=/home/tkloczko/rpmbuild/BUILDROOT/python-fixtures-3.0.0-23.fc35.x86_64/usr/lib64/python3.8/site-packages:/home/tkloczko/rpmbuild/BUILDROOT/python-fixtures-3.0.0-23.fc35.x86_64/usr/lib/python3.8/site-packages
+ /usr/bin/python3 -Bm pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.9, pytest-6.2.3, py-1.10.0, pluggy-0.13.1
rootdir: /home/tkloczko/rpmbuild/BUILD/fixtures-3.0.0
plugins: forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, asyncio-0.14.0, expect-1.1.0, cov-2.11.1, mock-3.5.1, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, timeout-1.4.2, betamax-0.8.1, pyfakefs-4.4.0, freezegun-0.4.2, cases-3.4.6, hypothesis-6.10.1, case-1.5.3, isort-1.3.0, aspectlib-1.5.2, flaky-3.7.0
collected 130 items

fixtures/tests/test_callmany.py ...                                                                                                                                  [  2%]
fixtures/tests/test_fixture.py ...........................                                                                                                           [ 23%]
fixtures/tests/test_testcase.py ....                                                                                                                                 [ 26%]
fixtures/tests/_fixtures/test_environ.py ........                                                                                                                    [ 32%]
fixtures/tests/_fixtures/test_logger.py .............                                                                                                                [ 42%]
fixtures/tests/_fixtures/test_mockpatch.py ......                                                                                                                    [ 46%]
fixtures/tests/_fixtures/test_monkeypatch.py ...........ss................                                                                                           [ 69%]
fixtures/tests/_fixtures/test_packagepath.py ..                                                                                                                      [ 70%]
fixtures/tests/_fixtures/test_popen.py ...........                                                                                                                   [ 79%]
fixtures/tests/_fixtures/test_pythonpackage.py ...                                                                                                                   [ 81%]
fixtures/tests/_fixtures/test_pythonpath.py ..                                                                                                                       [ 83%]
fixtures/tests/_fixtures/test_streams.py .......                                                                                                                     [ 88%]
fixtures/tests/_fixtures/test_tempdir.py .......                                                                                                                     [ 93%]
fixtures/tests/_fixtures/test_temphomedir.py ..                                                                                                                      [ 95%]
fixtures/tests/_fixtures/test_timeout.py ...                                                                                                                         [ 97%]
fixtures/tests/_fixtures/test_warnings.py ...                                                                                                                        [100%]

============================================================================= warnings summary =============================================================================
../../../../../usr/lib64/python3.8/unittest/case.py:26
  /usr/lib64/python3.8/unittest/case.py:26: PytestCollectionWarning: cannot collect test class 'SkipTest' because it has a __init__ constructor (from: fixtures/tests/_fixtures/test_timeout.py)
    class SkipTest(Exception):

-- Docs: https://docs.pytest.org/en/stable/warnings.html
========================================================================= short test summary info ==========================================================================
SKIPPED [1] fixtures/tests/_fixtures/test_monkeypatch.py:217: Fails with Python 3.9
SKIPPED [1] fixtures/tests/_fixtures/test_monkeypatch.py:185: Fails with Python 3.9
================================================================ 128 passed, 2 skipped, 1 warning in 2.42s =================================================================
+ /usr/bin/python3 -m testtools.run fixtures.test_suite
/usr/lib64/python3.8/runpy.py:127: RuntimeWarning: 'testtools.run' found in sys.modules after import of package 'testtools', but prior to execution of 'testtools.run'; this may result in unpredictable behaviour
  warn(RuntimeWarning(msg))
Tests running...

Ran 128 tests in 2.031s

Asynchronous fixtures

I'd like something that's almost exactly like fixtures, except that I want to have _setUp and any cleanups I add return Deferreds.

I also therefore want to be able to useFixture on those asynchronous fixtures.

Left to my own devices, I think I would achieve this by factoring out the addCleanup logic that's already in Twisted's testing framework, and then create a parallel implementation of fixtures, along with an adapter that takes regular synchronous fixtures and makes them return Deferreds.

However, that's not optimal. I'd like something that works for various asynchronous abstractions, not just Twisted, and I'd like to have some means of avoiding interface skew.

(Was going to file on Launchpad but lost my 2FA token)

_setUp() convention no longer allows a fixture to skip a test

A fixture might be trying to grab a not-necessarily-available resource, like a database connection. The fixture should be able to invoke skipTest(). This was possible when overriding the setUp() method, but in the new _setUp() convention all exceptions are caught, including skip exceptions. Fixture should have a skip() method and setUp() should handle this special exception as a skip for the test overall.

testtools should be a test extra requirement

fixtures only needs testtools for its own test suite. If a user of fixtures isn't interested in running the Fixtures test suite, they shouldn't have to install testtools as well.
