Comments (14)
Apparently, the compatibility issues weren't as big in v0.23.4 as they were initially in the v0.23 releases.
Thanks to the investigation by @mgorny and the subsequent PR submitted by @akeeman there's now a pre-release of pytest-asyncio with pytest 8 support.
Version v0.23.5a0 does not contain code changes compared to v0.23.4, apart from fixes to the type annotations. Unless there are any unexpected issues, I'll tag a proper patch release by the end of the week.
from pytest-asyncio.
Thanks. Unfortunately, pytest-asyncio doesn't properly limit the maximum pytest version.
Pytest 8 introduces changes to the collection phase. Since it's a bit more effort for pytest-asyncio to accommodate those, I plan to temporarily exclude pytest>=8 from the compatible versions in a (post?) release. This doesn't resolve the issue, but it will prevent users from installing incompatible versions of pytest and pytest-asyncio.
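As a rough illustration of what such an exclusion does (a simplified sketch; pip actually resolves versions per PEP 440, typically via the `packaging` library, and this toy version ignores pre-release and dev segments):

```python
def satisfies(version: str, lower: str, upper: str) -> bool:
    """Check `lower <= version < upper` using plain numeric components.

    Simplified stand-in for a real PEP 440 specifier such as
    `pytest>=7.0.0,<8`; names and bounds here are illustrative only.
    """
    as_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return as_tuple(lower) <= as_tuple(version) < as_tuple(upper)

print(satisfies("7.4.4", "7.0.0", "8"))  # True: pytest 7.x is allowed
print(satisfies("8.0.0", "7.0.0", "8"))  # False: pytest 8 is excluded
```

With such a bound in place, a resolver simply refuses to pick pytest 8.x alongside the pinned pytest-asyncio release, which is why this prevents broken installs without actually fixing the incompatibility.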
@seifertm Could you please release 0.23.4? pytest 8.0 was just released, and users are getting errors when using pytest-asyncio; see #763
There were no reports of issues with pytest-asyncio 0.23.5a0 and pytest 8. There are a lot of people wanting to upgrade, and no significant changes were required for pytest 8 support. Therefore, I tagged v0.23.5.
Thanks!
@dolfinus @4danmi pytest-asyncio-0.23.4 is released. It contains a proper upper bound for the pytest version, so that it doesn't get installed with pytest 8.
@seifertm Can we please release a hotfix?
pytest-asyncio v0.23.4a0 adds a maximum pytest version (pytest<8) to install_requires in setup.cfg. This should solve the issues in your CI that tests pre-release versions.
What's still left is to add actual support for pytest 8.
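For reference, such a bound in a declarative setuptools config would look roughly like this (a hypothetical excerpt; the actual setup.cfg lists other dependencies and may use different version numbers):

```ini
[options]
install_requires =
    pytest >= 7.0.0, < 8
```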
Thanks!
For the record, Gentoo has been patching pytest 8 support in since 2024-02-06, and we've had no issues reported so far.
Understood, thank you!
Confirmed, thank you! https://github.com/ipython/ipykernel/actions/runs/7468436406/job/20323873110
> @dolfinus @4danmi pytest-asyncio-0.23.4 is released. It contains a proper upper bound for the pytest version, so that it doesn't get installed with pytest 8.
Seems like that fix is breaking in the conda-forge feedstock, so 0.23.4 is not yet available on conda-forge:
I think that this restriction also has to be added to the conda-forge recipe.
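In a conda-forge recipe, the analogous pin would go in the run requirements of the feedstock's meta.yaml (a hypothetical excerpt; the real recipe lives in the feedstock repository and may differ):

```yaml
requirements:
  run:
    - pytest >=7.0.0,<8
```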
Are things really that bad, though? From a quick run of the test suite, I'm seeing two test failures, but I've already tested some revdeps and they all worked without problems. Perhaps 8.0.0 final is more compatible than the RCs were.
Test output
$ make test
coverage run --parallel-mode --omit */_version.py -m pytest
========================================================= test session starts =========================================================
platform linux -- Python 3.11.7, pytest-8.0.0, pluggy-1.3.0
rootdir: /tmp/pytest-asyncio
configfile: setup.cfg
testpaths: docs/source, tests
plugins: asyncio-0.23.5.dev10+ge92efad, hypothesis-6.96.2
asyncio: mode=Mode.AUTO
collected 166 items
docs/source/concepts_function_scope_example.py . [ 0%]
docs/source/concepts_module_scope_example.py .. [ 1%]
docs/source/how-to-guides/class_scoped_loop_example.py .. [ 3%]
docs/source/how-to-guides/module_scoped_loop_example.py .. [ 4%]
docs/source/how-to-guides/multiple_loops_example.py .. [ 5%]
docs/source/reference/fixtures/event_loop_example.py . [ 6%]
docs/source/reference/fixtures/event_loop_policy_example.py . [ 6%]
docs/source/reference/fixtures/event_loop_policy_parametrized_example.py .. [ 7%]
docs/source/reference/markers/class_scoped_loop_custom_policies_strict_mode_example.py .. [ 9%]
docs/source/reference/markers/class_scoped_loop_strict_mode_example.py .. [ 10%]
docs/source/reference/markers/class_scoped_loop_with_fixture_strict_mode_example.py . [ 10%]
docs/source/reference/markers/function_scoped_loop_pytestmark_strict_mode_example.py . [ 11%]
docs/source/reference/markers/function_scoped_loop_strict_mode_example.py . [ 12%]
docs/source/reference/markers/module_scoped_loop_strict_mode_example.py ... [ 13%]
tests/async_fixtures/test_async_fixtures.py .. [ 15%]
tests/async_fixtures/test_async_fixtures_scope.py . [ 15%]
tests/async_fixtures/test_async_fixtures_with_finalizer.py .. [ 16%]
tests/async_fixtures/test_async_gen_fixtures.py ... [ 18%]
tests/async_fixtures/test_nested.py . [ 19%]
tests/async_fixtures/test_parametrized_loop.py . [ 19%]
tests/hypothesis/test_base.py ....... [ 24%]
tests/loop_fixture_scope/test_loop_fixture_scope.py .. [ 25%]
tests/markers/test_class_scope.py .......... [ 31%]
tests/markers/test_function_scope.py ...... [ 34%]
tests/markers/test_module_scope.py .......... [ 40%]
tests/markers/test_package_scope.py ......... [ 46%]
tests/markers/test_session_scope.py F.......... [ 53%]
tests/modes/test_auto_mode.py ...... [ 56%]
tests/modes/test_strict_mode.py ..... [ 59%]
tests/test_asyncio_fixture.py ...... [ 63%]
tests/test_asyncio_mark.py ....... [ 67%]
tests/test_dependent_fixtures.py .. [ 68%]
tests/test_doctest.py .. [ 69%]
tests/test_event_loop_fixture.py . [ 70%]
tests/test_event_loop_fixture_finalizer.py ...... [ 74%]
tests/test_event_loop_fixture_override_deprecation.py .... [ 76%]
tests/test_explicit_event_loop_fixture_request.py ....... [ 80%]
tests/test_import.py ... [ 82%]
tests/test_is_async_test.py ..F. [ 84%]
tests/test_multiloop.py . [ 85%]
tests/test_port_factories.py ...... [ 89%]
tests/test_simple.py .......... [ 95%]
tests/test_skips.py ....... [ 99%]
tests/test_subprocess.py . [100%]
============================================================== FAILURES ===============================================================
_____________________________________ test_asyncio_mark_provides_session_scoped_loop_strict_mode ______________________________________
/tmp/pytest-asyncio/tests/markers/test_session_scope.py:65: in test_asyncio_mark_provides_session_scoped_loop_strict_mode
result.assert_outcomes(passed=4)
E AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 0, ...}
E
E Omitting 4 identical items, use -vv to show
E Differing items:
E {'passed': 3} != {'passed': 4}
E {'failed': 1} != {'failed': 0}
E Use -v to get more diff
-------------------------------------------------------- Captured stdout call ---------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.11.7, pytest-8.0.0, pluggy-1.3.0
rootdir: /tmp/pytest-of-mgorny/pytest-12/test_asyncio_mark_provides_session_scoped_loop_strict_mode0
plugins: asyncio-0.23.5.dev10+ge92efad, hypothesis-6.96.2
asyncio: mode=Mode.STRICT
collected 4 items
subpkg/test_subpkg.py F [ 25%]
test_module_one.py . [ 50%]
test_module_two.py .. [100%]
=================================== FAILURES ===================================
______________________ test_subpackage_runs_in_same_loop _______________________
async def test_subpackage_runs_in_same_loop():
> assert asyncio.get_running_loop() is shared_module.loop
E assert <_UnixSelectorEventLoop running=True closed=False debug=False> is None
E + where <_UnixSelectorEventLoop running=True closed=False debug=False> = <built-in function get_running_loop>()
E + where <built-in function get_running_loop> = asyncio.get_running_loop
E + and None = shared_module.loop
subpkg/test_subpkg.py:9: AssertionError
=========================== short test summary info ============================
FAILED subpkg/test_subpkg.py::test_subpackage_runs_in_same_loop - assert <_Un...
========================= 1 failed, 3 passed in 0.04s ==========================
____________________________________ test_returns_false_for_unmarked_coroutine_item_in_strict_mode ____________________________________
/tmp/pytest-asyncio/tests/test_is_async_test.py:81: in test_returns_false_for_unmarked_coroutine_item_in_strict_mode
result.assert_outcomes(skipped=1)
E AssertionError: assert {'errors': 0,...pped': 0, ...} == {'errors': 0,...pped': 1, ...}
E
E Omitting 4 identical items, use -vv to show
E Differing items:
E {'skipped': 0} != {'skipped': 1}
E {'failed': 1} != {'failed': 0}
E Use -v to get more diff
-------------------------------------------------------- Captured stdout call ---------------------------------------------------------
============================= test session starts ==============================
platform linux -- Python 3.11.7, pytest-8.0.0, pluggy-1.3.0
rootdir: /tmp/pytest-of-mgorny/pytest-12/test_returns_false_for_unmarked_coroutine_item_in_strict_mode0
plugins: asyncio-0.23.5.dev10+ge92efad, hypothesis-6.96.2
asyncio: mode=Mode.STRICT
collected 1 item
test_returns_false_for_unmarked_coroutine_item_in_strict_mode.py F [100%]
=================================== FAILURES ===================================
__________________________________ test_coro ___________________________________
cls = <class '_pytest.runner.CallInfo'>
func = <function call_runtest_hook.<locals>.<lambda> at 0x7fa402873740>
when = 'call'
reraise = (<class '_pytest.outcomes.Exit'>, <class 'KeyboardInterrupt'>)
@classmethod
def from_call(
cls,
func: Callable[[], TResult],
when: Literal["collect", "setup", "call", "teardown"],
reraise: Optional[
Union[Type[BaseException], Tuple[Type[BaseException], ...]]
] = None,
) -> "CallInfo[TResult]":
"""Call func, wrapping the result in a CallInfo.
:param func:
The function to call. Called without arguments.
:param when:
The phase in which the function is called.
:param reraise:
Exception or exceptions that shall propagate if raised by the
function, instead of being wrapped in the CallInfo.
"""
excinfo = None
start = timing.time()
precise_start = timing.perf_counter()
try:
> result: Optional[TResult] = func()
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/runner.py:345:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/runner.py:266: in <lambda>
lambda: ihook(item=item, **kwds), when=when, reraise=reraise
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/pluggy/_hooks.py:493: in __call__
return self._hookexec(self.name, self._hookimpls, kwargs, firstresult)
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/pluggy/_manager.py:115: in _hookexec
return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/pluggy/_manager.py:457: in traced_hookexec
return outcome.get_result()
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/pluggy/_manager.py:454: in <lambda>
lambda: oldcall(hook_name, hook_impls, caller_kwargs, firstresult)
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/threadexception.py:87: in pytest_runtest_call
yield from thread_exception_runtest_hook()
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/threadexception.py:63: in thread_exception_runtest_hook
yield
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/unraisableexception.py:90: in pytest_runtest_call
yield from unraisable_exception_runtest_hook()
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/unraisableexception.py:65: in unraisable_exception_runtest_hook
yield
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/logging.py:839: in pytest_runtest_call
yield from self._runtest_for(item, "call")
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/logging.py:822: in _runtest_for
yield
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/capture.py:882: in pytest_runtest_call
return (yield)
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/skipping.py:257: in pytest_runtest_call
return (yield)
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/runner.py:181: in pytest_runtest_call
raise e
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/runner.py:173: in pytest_runtest_call
item.runtest()
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/python.py:1836: in runtest
self.ihook.pytest_pyfunc_call(pyfuncitem=self)
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/pluggy/_hooks.py:493: in __call__
return self._hookexec(self.name, self._hookimpls, kwargs, firstresult)
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/pluggy/_manager.py:115: in _hookexec
return self._inner_hookexec(hook_name, methods, kwargs, firstresult)
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/pluggy/_manager.py:457: in traced_hookexec
return outcome.get_result()
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/pluggy/_manager.py:454: in <lambda>
lambda: oldcall(hook_name, hook_impls, caller_kwargs, firstresult)
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/python.py:190: in pytest_pyfunc_call
async_warn_and_skip(pyfuncitem.nodeid)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
nodeid = 'test_returns_false_for_unmarked_coroutine_item_in_strict_mode.py::test_coro'
def async_warn_and_skip(nodeid: str) -> None:
msg = "async def functions are not natively supported and have been skipped.\n"
msg += (
"You need to install a suitable plugin for your async framework, for example:\n"
)
msg += " - anyio\n"
msg += " - pytest-asyncio\n"
msg += " - pytest-tornasync\n"
msg += " - pytest-trio\n"
msg += " - pytest-twisted"
> warnings.warn(PytestUnhandledCoroutineWarning(msg.format(nodeid)))
E pytest.PytestUnhandledCoroutineWarning: async def functions are not natively supported and have been skipped.
E You need to install a suitable plugin for your async framework, for example:
E - anyio
E - pytest-asyncio
E - pytest-tornasync
E - pytest-trio
E - pytest-twisted
/tmp/pytest-asyncio/.tox/py311/lib/python3.11/site-packages/_pytest/python.py:182: PytestUnhandledCoroutineWarning
=========================== short test summary info ============================
FAILED test_returns_false_for_unmarked_coroutine_item_in_strict_mode.py::test_coro
============================== 1 failed in 0.37s ===============================
=================================================== 2 failed, 164 passed in 21.83s ====================================================
make: *** [Makefile:24: test] Error 1
@analog-cbarber I saw that you already opened conda-forge/pytest-asyncio-feedstock#41, and it looks like your issue has already been resolved.
The feedstock is the correct place to report this kind of issue. I'm happy to help from the pytest-asyncio side, but none of the pytest-asyncio developers maintain the conda feedstock.
> @analog-cbarber I saw that you already opened conda-forge/pytest-asyncio-feedstock#41 and it looks your issue has been resolved already.
> The feedstock is the correct place to report these kind of issues. I'm happy to help from the pytest-asyncio side, but none of the pytest-asyncio developers maintain the conda feedstock.
Probably a good idea for one of the pytest-asyncio maintainers to sign on to maintain the conda feedstock so you don't get surprised by this kind of thing. It usually is not that much work.