
pytest-docker's Introduction

Docker-based integration tests


Description

Simple pytest fixtures that help you write integration tests with Docker and Docker Compose. Specify all necessary containers in a docker-compose.yml file and pytest-docker will spin them up for the duration of your tests.

pytest-docker was originally created by André Caron.

Installation

Install pytest-docker with pip or add it to your test requirements.

By default, it uses the docker compose command, so it relies on the Compose plugin for Docker (also called Docker Compose V2).

Docker Compose V1 compatibility

If you want to use the old docker-compose command (deprecated since July 2023, not receiving updates since 2021) then you can do it using the docker-compose-command fixture:

@pytest.fixture(scope="session")
def docker_compose_command() -> str:
    return "docker-compose"

If you want to use the pip-distributed version of the docker-compose command, you can install it using

pip install pytest-docker[docker-compose-v1]

Another option is to use compose-switch.

Usage

Here is an example of a test that depends on an HTTP service.

With a docker-compose.yml file like this (using the httpbin service):

version: '2'
services:
  httpbin:
    image: "kennethreitz/httpbin"
    ports:
      - "8000:80"

You can write a test like this:

import pytest
import requests

from requests.exceptions import ConnectionError


def is_responsive(url):
    try:
        response = requests.get(url)
        return response.status_code == 200
    except ConnectionError:
        return False


@pytest.fixture(scope="session")
def http_service(docker_ip, docker_services):
    """Ensure that HTTP service is up and responsive."""

    # `port_for` takes a container port and returns the corresponding host port
    port = docker_services.port_for("httpbin", 80)
    url = "http://{}:{}".format(docker_ip, port)
    docker_services.wait_until_responsive(
        timeout=30.0, pause=0.1, check=lambda: is_responsive(url)
    )
    return url


def test_status_code(http_service):
    status = 418
    response = requests.get(http_service + "/status/{}".format(status))

    assert response.status_code == status

By default this plugin will try to open docker-compose.yml in your tests directory. If you need to use a custom location, override the docker_compose_file fixture inside your conftest.py file:

import os
import pytest


@pytest.fixture(scope="session")
def docker_compose_file(pytestconfig):
    return os.path.join(str(pytestconfig.rootdir), "mycustomdir", "docker-compose.yml")

Available fixtures

By default the scope of the fixtures is session, but it can be changed with the pytest command-line option --container-scope <scope>:

pytest --container-scope <scope> <test_directory>

For available scopes and descriptions see https://docs.pytest.org/en/6.2.x/fixture.html#fixture-scopes

docker_ip

Determine the IP address for TCP connections to Docker containers.

docker_compose_file

Get an absolute path to the docker-compose.yml file. Override this fixture in your tests if you need a custom location.

docker_compose_project_name

Generate a project name using the current process PID. Override this fixture in your tests if you need a particular project name.

docker_services

Start all services from the docker compose file (docker-compose up). After tests are finished, shut down all services (docker-compose down).

docker_compose_command

Docker Compose command used to run Compose. The default is Docker Compose V2 (the command is docker compose). If you want to use Docker Compose V1, override this fixture to return docker-compose.

docker_setup

Get the list of docker_compose commands to be executed for test spawn actions. Override this fixture in your tests if you need to change spawn actions. Returning anything that would evaluate to False will skip this command.

docker_cleanup

Get the list of docker_compose commands to be executed for test clean-up actions. Override this fixture in your tests if you need to change clean-up actions. Returning anything that would evaluate to False will skip this command.
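For instance, a setup override can prepend a defensive teardown before bringing the stack up. This is a sketch: the default command lists shown here (["up --build -d"] and ["down -v"]) are assumptions based on recent pytest-docker versions, so check your installed version's defaults before relying on them.

```python
import pytest

# Assumed defaults for recent pytest-docker versions; verify against the
# version you have installed.
DEFAULT_SETUP = ["up --build -d"]
DEFAULT_CLEANUP = ["down -v"]


def setup_commands(clean_first=False):
    """Build the list of compose commands run before the tests start."""
    return (["down -v"] if clean_first else []) + DEFAULT_SETUP


@pytest.fixture(scope="session")
def docker_setup():
    # Tear down any leftovers from an interrupted previous run, then start.
    return setup_commands(clean_first=True)
```

Because the fixture returns a list, each entry is executed as a separate compose invocation, and returning a falsy value skips setup entirely.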

Development

Use of a virtual environment is recommended. See the venv package for more information.

First, install pytest-docker and its test dependencies:

pip install -e ".[tests]"

Run tests with

pytest -c setup.cfg

to make sure that the correct configuration is used. This is also how tests are run in CI.

Use black with default settings for formatting. You can also use pylint with setup.cfg as the configuration file, as well as mypy for type checking.

Contributing

This pytest plug-in and its source code are made available to you under an MIT license. It is safe to use in commercial and closed-source applications. Read the license for details!

Found a bug? Think a new feature would make this plug-in more practical? We welcome issues and pull requests!

When creating a pull request, be sure to follow this project's conventions (see above).

pytest-docker's People

Contributors

afoerster, andrelouiscaron, augi, bettercallbene, dtrifiro, gdetrez, grant-zietsman, hsheth2, jacklinke, johnvillalovos, kianmeng, languitar, lexi-k, lukas-bednar, luminaar, mickael-mounier, n1ngu, otetard, raddessi, skshetry, vschmidt94


pytest-docker's Issues

Docker in docker

I have managed to implement pytest-docker to work locally with localstack - It's been great.

I would like some advice on what is considered best practice when running pytest-docker in a CI job. I'm specifically working on GitLab, which does allow for docker in docker. However, it's proving difficult to implement. Any suggestions from past experiences would be helpful. Thanks 😊

import `docker-compose` and `docker-py` rather than fork subprocesses

Hi, this plugin looks really cool! After briefly looking over the code and some of the issues on this project, I thought I'd pop in and point out that many of the problems (not having access to service logs, for example) would be simplified if it were possible to have direct access to the objects exposed by the docker-compose or docker-py library APIs. Specifically, you should be able to do almost everything you can do from the command line using this class, which should provide the following benefits for this codebase:

  • allow you and pytest-docker plugin users to programmatically access service and container info
  • allow you to dispense with subprocess management in your code and focus on implementing new ways to interact with docker in pytest-idiomatic ways

Even something as simple as a more generic version of this would be fairly helpful. Here I use the docker-py library to start up a container, yield it as the test fixture object, then tear it down post-yield. In an approach like this, providing access to service/container logs as requested in #13 could be a configurable behavior that occurs post-yield.

I hope you find my suggestions helpful. I would like to try my hand at implementing this if you don't have time or don't want to yourself.

Unique container names are bad

I know I can override container names, but I think the default implementation is bad.

Currently containers are named "pytest{}".format(os.getpid()). This leads to the following problems:

  • If test runs fail or are interrupted (ctrl-c), containers may stay around. Due to the unique names they start to accumulate. Same for images.
  • Docker-compose doesn't recognize that things belong to each other, and has conflicts. For example, if you use a static network subnet, then pytest-docker fails as different runs try to use the same network, but it's named separately.

This cost me days to debug (no docker wiz here).

Is there a specific reason why to use the pid-based naming? I would follow simply the behavior of docker-compose:

import os

import pytest


@pytest.fixture(scope='session')
def docker_compose_project_name(pytestconfig):
    """Generate a project name using the project's root directory.

    Override this fixture in your tests if you need a particular project name.
    """
    return "{}pytest".format(os.path.basename(str(pytestconfig.rootdir)))

add --remove-orphans to docker-compose command?

I got the following error while using the plugin. Does it make sense to add --remove-orphans to the docker-compose command?

Exception: Command 'docker-compose -f "<hidden>" -p "pytest7" down -v' returned 1: """Stopping <hidden> ... 
E           Stopping <hidden>      ... 
E           
Stopping <hidden> ... done
Stopping <hidden>      ... done
Found orphan containers (<hidden>, <hidden>, <hidden>) for this project. If you removed or renamed this service in your compose file, you can run this command with the --remove-orphans flag to clean it up.

Swarm support

Currently, the code is fixed on using docker compose for managing services. Is there any interest in developing Swarm support as an addition? I'd probably chip in on that, since this might be in the best interest of the project I'm currently handling.

implement pytest plugin support directly in docker-compose itself

A thought that occurs to me is that it might be a good idea to add pytest features directly to docker-compose itself. Advantages, assuming docker-compose maintainers would allow it, include the following:

  • Improve visibility of these fixtures in the python and docker-compose communities, thereby making it more likely to get new features and bug fix support.
  • Make it possible through test coverage in that project to prevent fixture breakage due to breaking upstream library API changes; users of this project may be put in the unfortunate position of wanting new docker-compose features but not being able to upgrade it in their projects' virtualenvs because newer versions of docker-compose may change behavior here.

Support for pytest v7

There is a constraint for pytest disallowing v7 which was released a few days back:

pytest >=4.0, <7.0

This is preventing us from updating to the newest version. It looks like nothing in pytest-docker should be affected, and the constraint can simply be relaxed to <8.0.

How to wait for container to become healthy?

Hi I'm trying to use this plugin with a docker-compose.yml file like this:

version: "3.4"
services:
  db:
    image: postgis/postgis:14-3.2-alpine
    environment:
      - POSTGRES_DB=foo
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
    healthcheck:
      test: ["CMD", "nc", "-z", "localhost", "5432"]
      interval: 1s
      timeout: 1m
      retries: 60
  web:
    image: my-freshly-built-web-app
    depends_on:
      db:
        condition: service_healthy
    environment:
      - POSTGRES_DB=foo
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_HOST=db
      - POSTGRES_PORT=5432
    healthcheck:
      test: ["CMD", "nc", "-z", "localhost", "8000"]
      interval: 1s
      timeout: 1m
      retries: 60
    ports:
      - "8000:8000"

It seems that the docker_services fixture becomes available once all the services specified in docker-compose.yml have started.

However, if the web container is slow (needs to init the db, etc.), the web container is not actually ready to serve on the port and tests fail.

My current work-around is to add another, dummy container that depends on web being healthy...
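An alternative to the dummy container is to poll readiness from the test side with wait_until_responsive. A minimal sketch using a plain TCP probe (the service name web and port 8000 match the compose file above; for a real readiness check you would hit an actual application endpoint instead):

```python
import socket

import pytest


def is_port_open(host, port, timeout=1.0):
    """Return True once a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


@pytest.fixture(scope="session")
def web_service(docker_ip, docker_services):
    port = docker_services.port_for("web", 8000)
    docker_services.wait_until_responsive(
        timeout=120.0, pause=0.5, check=lambda: is_port_open(docker_ip, port)
    )
    return "http://{}:{}".format(docker_ip, port)
```

Note that an open port only proves the process is listening; if the app binds the socket before finishing initialization, you still need an application-level check.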

Insufficient removal of control characters on windows with cmd.exe as shell

Affected Line

endpoint = output.strip().decode("utf-8")

Affected Version: at least since 0.10.3

I use PyCharm on Windows 10 to debug my pytest test cases. When starting the containers, the above line is executed to resolve a port for a service.

output = self._docker_compose.execute("port %s %d" % (service, container_port)) results in b'0.0.0.0:12347\r\n\x1b[0m'. No control character is removed, so output.strip().decode("utf-8") results in a string that still includes the control characters.

Some lines later (L86), if len(endpoint.split("\n")) > 1: does not handle Windows line endings.

On Linux or Git Bash for Windows, there is no problem.

My fix for that. Please check this solution:

endpoint = endpoint.replace("\r", "")  # add support for Windows line endings
if len(endpoint.split("\n")) > 1:
    endpoint = endpoint.split("\n")[-1]  # index -1 is also wrong for me because it results in '\x1b[0m'. Maybe use 0 or -2

`"docker-compose-v1": executable file not found in $PATH` when running from Makefile

Hey folks,

I'm having a strange issue where I get:

E           Exception: Command docker-compose -f "/Users/ciaran/dev/ctk/chaostoolkit-lib/tests/test_exit/docker-compose.yaml" -p "pytest27587" up --build -d returned 1: """exec: "docker-compose-v1": executable file not found in $PATH
E           Current PATH : 
E           """.

If I run my pytest tests using pytest-docker from a Makefile target.

I'm using virtualenv for packages and I can run the tests fine like so:

(.venv) $ pytest 
# All pass fine

But if I call my Makefile target which looks like:

.PHONY: tests
tests:
	pytest
(.venv) $ make tests
# This moans about docker-compose and the $PATH

Does anyone have any ideas as to why this might be breaking?

I notice that which docker-compose gives me different outputs:

When run from outside of make:

/usr/local/bin/docker-compose

When run inside make:

/Users/ciaran/dev/ctk/chaostoolkit-lib/.venv/bin/docker-compose

I understand that in the docs you recommend installing docker-compose as a python package, but it seems like that's done regardless upon installing pytest-docker. Either way though, I'm still unsure as to why running pytest outside of make works and inside it doesn't 🙁

Any help is appreciated!

Unable to run in CI pipeline

Hi, I am wondering if anyone has tried to use this lib in a CI/CD pipeline.

This is my gitlab-ci job

integration-tests:
  stage: test
  image: python:3.8.3
  before_script:
    - pip install pytest==6.2.3 pytest-docker==0.10.3 requests==2.25.1
  script:
    - pytest tests/integration_tests/ -v -s

My test works in my local environment, but in the CI job it seems like it can't detect docker. I even tried adding a manual installation of docker & docker-compose, but the result is still the same.

_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
command = 'docker-compose -f "/builds/recommendation-engine/product_association/docker-compose-test.yml" -p "pytest5314" up --build -d'
success_codes = (0,)
    def execute(command, success_codes=(0,)):
        """Run a shell command."""
        try:
            output = subprocess.check_output(command, stderr=subprocess.STDOUT, shell=True)
            status = 0
        except subprocess.CalledProcessError as error:
            output = error.output or b""
            status = error.returncode
            command = error.cmd

        if status not in success_codes:
>           raise Exception(
                'Command {} returned {}: """{}""".'.format(
                    command, status, output.decode("utf-8")
                )
            )
E           Exception: Command docker-compose -f "/builds/imda_dsl/ai-shopfloor/retail-recommendation-engine/product_association/docker-compose-test.yml" -p "pytest5314" up --build -d returned 1: """Traceback (most recent call last):
E             File "/usr/local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 699, in urlopen
E               httplib_response = self._make_request(
E             File "/usr/local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 394, in _make_request
E               conn.request(method, url, **httplib_request_kw)
E             File "/usr/local/lib/python3.8/http/client.py", line 1240, in request
E               self._send_request(method, url, body, headers, encode_chunked)
E             File "/usr/local/lib/python3.8/http/client.py", line 1286, in _send_request
E               self.endheaders(body, encode_chunked=encode_chunked)
E             File "/usr/local/lib/python3.8/http/client.py", line 1235, in endheaders
E               self._send_output(message_body, encode_chunked=encode_chunked)
E             File "/usr/local/lib/python3.8/http/client.py", line 1006, in _send_output
E               self.send(msg)
E             File "/usr/local/lib/python3.8/http/client.py", line 946, in send
E               self.connect()
E             File "/usr/local/lib/python3.8/site-packages/docker/transport/unixconn.py", line 30, in connect
E               sock.connect(self.unix_socket)
E           FileNotFoundError: [Errno 2] No such file or directory
E           
E           During handling of the above exception, another exception occurred:
E           
E           Traceback (most recent call last):
E             File "/usr/local/lib/python3.8/site-packages/requests/adapters.py", line 439, in send
E               resp = conn.urlopen(
E             File "/usr/local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 755, in urlopen
E               retries = retries.increment(
E             File "/usr/local/lib/python3.8/site-packages/urllib3/util/retry.py", line 532, in increment
E               raise six.reraise(type(error), error, _stacktrace)
E             File "/usr/local/lib/python3.8/site-packages/urllib3/packages/six.py", line 769, in reraise
E               raise value.with_traceback(tb)
E             File "/usr/local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 699, in urlopen
E               httplib_response = self._make_request(
E             File "/usr/local/lib/python3.8/site-packages/urllib3/connectionpool.py", line 394, in _make_request
E               conn.request(method, url, **httplib_request_kw)
E             File "/usr/local/lib/python3.8/http/client.py", line 1240, in request
E               self._send_request(method, url, body, headers, encode_chunked)
E             File "/usr/local/lib/python3.8/http/client.py", line 1286, in _send_request
E               self.endheaders(body, encode_chunked=encode_chunked)
E             File "/usr/local/lib/python3.8/http/client.py", line 1235, in endheaders
E               self._send_output(message_body, encode_chunked=encode_chunked)
E             File "/usr/local/lib/python3.8/http/client.py", line 1006, in _send_output
E               self.send(msg)
E             File "/usr/local/lib/python3.8/http/client.py", line 946, in send
E               self.connect()
E             File "/usr/local/lib/python3.8/site-packages/docker/transport/unixconn.py", line 30, in connect
E               sock.connect(self.unix_socket)
E           urllib3.exceptions.ProtocolError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
E           
E           During handling of the above exception, another exception occurred:
E           
E           Traceback (most recent call last):
E             File "/usr/local/lib/python3.8/site-packages/docker/api/client.py", line 214, in _retrieve_server_version
E               return self.version(api_version=False)["ApiVersion"]
E             File "/usr/local/lib/python3.8/site-packages/docker/api/daemon.py", line 181, in version
E               return self._result(self._get(url), json=True)
E             File "/usr/local/lib/python3.8/site-packages/docker/utils/decorators.py", line 46, in inner
E               return f(self, *args, **kwargs)
E             File "/usr/local/lib/python3.8/site-packages/docker/api/client.py", line 237, in _get
E               return self.get(url, **self._set_request_timeout(kwargs))
E             File "/usr/local/lib/python3.8/site-packages/requests/sessions.py", line 555, in get
E               return self.request('GET', url, **kwargs)
E             File "/usr/local/lib/python3.8/site-packages/requests/sessions.py", line 542, in request
E               resp = self.send(prep, **send_kwargs)
E             File "/usr/local/lib/python3.8/site-packages/requests/sessions.py", line 655, in send
E               r = adapter.send(request, **kwargs)
E             File "/usr/local/lib/python3.8/site-packages/requests/adapters.py", line 498, in send
E               raise ConnectionError(err, request=request)
E           requests.exceptions.ConnectionError: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
E           
E           During handling of the above exception, another exception occurred:
E           
E           Traceback (most recent call last):
E             File "/usr/local/bin/docker-compose", line 8, in <module>
E               sys.exit(main())
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/main.py", line 81, in main
E               command_func()
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/main.py", line 200, in perform_command
E               project = project_from_options('.', options)
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/command.py", line 60, in project_from_options
E               return get_project(
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/command.py", line 152, in get_project
E               client = get_client(
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/docker_client.py", line 41, in get_client
E               client = docker_client(
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/docker_client.py", line 170, in docker_client
E               client = APIClient(use_ssh_client=not use_paramiko_ssh, **kwargs)
E             File "/usr/local/lib/python3.8/site-packages/docker/api/client.py", line 197, in __init__
E               self._version = self._retrieve_server_version()
E             File "/usr/local/lib/python3.8/site-packages/docker/api/client.py", line 221, in _retrieve_server_version
E               raise DockerException(
E           docker.errors.DockerException: Error while fetching server API version: ('Connection aborted.', FileNotFoundError(2, 'No such file or directory'))
E           """.

Thank you.

__init__() got an unexpected keyword argument 'tls'

Just started getting the following exception when running on BitBucket pipelines.
This was working yesterday.

I'm going to investigate further, however i thought i'd put this here for anyone else who was hitting the same issue.

E           Exception: Command docker-compose -f "/opt/atlassian/pipelines/agent/build/api/tests/docker-compose.yml" -p "pytest841" up --build -d returned 1: """Traceback (most recent call last):
E             File "/usr/local/bin/docker-compose", line 8, in <module>
E               sys.exit(main())
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/main.py", line 72, in main
E               command()
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/main.py", line 125, in perform_command
E               project = project_from_options('.', options)
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/command.py", line 66, in project_from_options
E               return get_project(
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/command.py", line 141, in get_project
E               client = get_client(
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/docker_client.py", line 45, in get_client
E               client = docker_client(
E             File "/usr/local/lib/python3.8/site-packages/compose/cli/docker_client.py", line 141, in docker_client
E               context = Context("compose", host=host, tls=verify)
E           TypeError: __init__() got an unexpected keyword argument 'tls'
E           """.
/usr/local/lib/python3.8/site-packages/pytest_docker/plugin.py:24: Exception

Pytest hangs on custom docker compose file

I have a custom docker compose file and I redefine the docker_compose_file fixture as instructed, but execution hangs and never gets to the actual test function. The path to the compose file is correct.

Reproduce code:

@pytest.fixture(scope="session")
def docker_compose_file(pytestconfig):
    compose_path = os.path.join(str(pytestconfig.rootdir), "docker-compose.live.yml")
    print(compose_path)
    return compose_path


def test_docker(docker_services):
    print("test")
    print(docker_services.port_for("mediamtx", 8554))
    assert True

To see test stdout output run: pytest -vv -s

Option to keep containers running or use manually started ones

I have a setup where spinning up the containers for tests is quite expensive. It would be nice if there was an option to either leave contains running after the first execution or to use a project started manually beforehand. That way, the expensive boot up of the containers can be avoided for subsequent tests.

Performance impact of pytest/subprocess on docker services?

Hi all, just wanted to check if anyone else has seen any significant performance effects on container spin-up and runtime.

I recently switched from a plain shell script spinning up a container to docker-compose using pytest-docker. It's quite a heavy app (a GitLab CE container that does a full reconfigure on every startup), so it already took around 2 minutes to get up and running for tests before, but now it's more like 3 minutes (at least on underpowered CI VMs), and some time-sensitive tests seem to be struggling, so more wait-time might need to be added.

I understand that both pytest and subprocess do a lot more under the hood which probably contributes, but I didn't expect such a big difference, but maybe I'm also doing something wrong :)

Access/print services logs

It would be really useful if we could access the logs of the services so that we can print those logs in case of a test failure. Without this, debugging is really hard.

Any idea how to implement this?
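Until the plugin exposes logs directly, one approach is to shell out to the compose CLI yourself, reusing the docker_compose_file and docker_compose_project_name fixtures. This is a sketch assuming Compose V2 syntax; adapt the command for V1 (docker-compose) if needed:

```python
import subprocess

import pytest


def build_logs_command(compose_file, project_name, service=None):
    """Assemble a `docker compose logs` invocation (Compose V2 syntax)."""
    cmd = ["docker", "compose", "-f", compose_file,
           "-p", project_name, "logs", "--no-color"]
    if service:
        cmd.append(service)
    return cmd


@pytest.fixture(scope="session")
def service_logs(docker_compose_file, docker_compose_project_name):
    """Return a callable that fetches logs for one service (or all)."""
    def _logs(service=None):
        cmd = build_logs_command(
            str(docker_compose_file), docker_compose_project_name, service
        )
        return subprocess.run(cmd, capture_output=True, text=True).stdout

    return _logs
```

A test (or a pytest hook run on failure) can then call service_logs("api") and print the result.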

Give access to services' environment variables?

I currently need to read the environment variables defined for my application in compose. Would be cool if there was a function for that in Services, don't you agree? I can write it.

Right now my implementation looks like this:

import pytest
import yaml


@pytest.fixture(scope='session')
def apps_environment(docker_compose_file):
    """Environment variables defined for the application in docker compose."""
    with open(docker_compose_file, 'r') as compose_file:
        docker_compose_content = yaml.safe_load(compose_file.read())
    # 'api' is the name of the service
    raw_env = docker_compose_content['services']['api']['environment']
    env = {}
    for env_line in raw_env:
        variable_name, variable_value = env_line.split('=', 1)
        env[variable_name] = variable_value
    return env

Oh, and thanks for the cool plugin :) Quite handy.

Docker host environment variable nested containers

Hi Team,

Suppose I have a container called pytest into which I mount docker.sock and let it create the containers defined by docker-compose.yml.

I've found that only host.docker.internal works, which means ignoring the docker_ip fixture.

Is there any other way around this? host.docker.internal is suitable for docker-to-docker communication, but my IDE's pytest cannot connect without redundant code to swap the hostname out.
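One way to centralize the hostname swap is to override the docker_ip fixture yourself. A sketch; the DNS alias below is a Docker Desktop convention and an assumption about your environment, not something pytest-docker provides:

```python
import pytest

# DNS alias that Docker Desktop injects into containers so they can reach
# the host. On plain Linux Docker you may need to add it yourself, e.g.
# extra_hosts: ["host.docker.internal:host-gateway"] in the compose file.
HOST_ALIAS = "host.docker.internal"


@pytest.fixture(scope="session")
def docker_ip():
    # All port_for/URL helpers built on docker_ip now target the host alias,
    # so tests work the same whether pytest runs on the host or in a container.
    return HOST_ALIAS
```

You could also make the returned value depend on an environment variable so the same conftest.py works both inside and outside a container.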

ModuleNotFoundError: No module named 'pkg_resources'

I'm unable to launch pytest with pytest-docker plugin which throws the following error, only major change I did was to reinstall the docker-compose as it was hanging

E           Exception: Command docker-compose -f "/Users/.../...../tests/it/docker-compose.yml" -p "pytest13507" up --build -d returned 1: """Traceback (most recent call last):
E             File "/usr/local/bin/docker-compose", line 6, in <module>
E               from pkg_resources import load_entry_point
E           ModuleNotFoundError: No module named 'pkg_resources'

The setuptools are installed and are latest and the pkg_resources is even available within the venv from where the tests are being launched

(venv) ➜  it git:(IT) ✗ python
Python 3.7.7 (default, Mar 10 2020, 15:43:03) 
[Clang 11.0.0 (clang-1100.0.33.17)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> from pkg_resources import load_entry_point

Accessing services from within a container running on docker

Hey guys,
I have a particular use case: my entire env runs on docker, and I would like to access the docker host from within the dev container to launch some services required for my tests.

I wonder if there is any way to do this using pytest-docker, or any good suggestions; if not, I think that feature would be worth considering.

How to add parametrize to our test case with pytest-docker?

Hi, I am wondering if it is possible to use @pytest.mark.parametrize together with these fixtures.

In the given example, the http_service fixture is already used, so I'm not sure how to add the parametrized variables in so that I can send multiple requests to the dockerized API.

def test_status_code(http_service):
    status = 418
    response = requests.get(http_service + "/status/{}".format(status))

    assert response.status_code == status

Endpoint empty error

On a windows system, the endpoint command returns a weird string:

0.0.0.0:9997\r\n�[0m

That's why the strip here does not change the string, and this line picks up the garbage part. I have solved it by using a regexp:

import re

ips = re.findall(r'[0-9]+(?:\.[0-9]+){3}:[0-9]+', endpoint)
assert len(ips) == 1
endpoint = ips[0]

Do you think this is a possible solution that could be merged in?

Allow for overriding of pytest fixture scope

Being able to override the pytest fixtures to have a "function" scope instead of "session" scope would be useful when using containers that may carry state between tests.

An example of a useful configuration (though hard-coded): Greenlight-Analytical@102eedb

Some boilerplate to connect a flag:

def pytest_addoption(parser):
    parser.addoption(
        "--keep-containers",
        action="store_true",
        help="Keep containers between tests. This is faster but may cause test failures from leftover state between tests.",
    )


@pytest.fixture
def keep_containers(request):
    return request.config.getoption("--keep-containers")


def keep_containers_scope(fixture_name, config):
    if config.getoption("--keep-containers", None):
        return "session"
    return "function"


# The dynamic scope callable can replace the hard-coded "session" scope on any
# of the fixtures, for example:
@pytest.fixture(scope=keep_containers_scope)
def stateful_fixture():
    ...

Fallback still starts the container with compose?

My impression from the wording around docker_allow_fallback was that if it returns True, it doesn't try to run anything with Compose at all and just returns localhost:port, where port comes from the port mapping in the docker-compose.yaml file for that service.

However, looking at https://github.com/AndreLouisCaron/pytest-docker/blob/master/src/pytest_docker/__init__.py#L169-L175, it seems that when fallback returns True but docker ps still succeeds, it happily goes on and still ends up executing L178, spinning up the containers through Compose. Only when fallback is True and docker ps fails does it do what I would expect.

This seems very strange to me. If I say it's supposed to use the fallback, why still check whether Docker is available and then ignore what the user configured?

I would expect the code like this instead:

    if docker_allow_fallback is True:
        # Run against localhost
        yield Services(docker_compose, docker_allow_fallback=True)
        return

Version Conflict with PyYAML 6.0

Hello

I would like to use pytest-docker in my library. However, I also need PyYAML at version 6.0. poetry is used as the package manager. When I install pytest-docker and PyYAML together, I get a poetry SolverProblemError:

p@ubuntu:~/repos/testsrepos/dockerpytest$ poetry add pyyaml pytest-docker
Using version ^6.0 for PyYAML
Using version ^0.10.3 for pytest-docker

Updating dependencies
Resolving dependencies... (0.5s)

  SolverProblemError

  Because no versions of docker-compose match >1.27.3,<1.27.4 || >1.27.4,<1.28.0 || >1.28.0,<1.28.2 || >1.28.2,<1.28.3 || >1.28.3,<1.28.4 || >1.28.4,<1.28.5 || >1.28.5,<1.28.6 || >1.28.6,<1.29.0 || >1.29.0,<1.29.1 || >1.29.1,<1.29.2 || >1.29.2,<2.0
   and docker-compose (1.27.3) depends on PyYAML (>=3.10,<6), docker-compose (>=1.27.3,<1.27.4 || >1.27.4,<1.28.0 || >1.28.0,<1.28.2 || >1.28.2,<1.28.3 || >1.28.3,<1.28.4 || >1.28.4,<1.28.5 || >1.28.5,<1.28.6 || >1.28.6,<1.29.0 || >1.29.0,<1.29.1 || >1.29.1,<1.29.2 || >1.29.2,<2.0) requires PyYAML (>=3.10,<6).
  And because docker-compose (1.27.4) depends on PyYAML (>=3.10,<6)
   and docker-compose (1.28.0) depends on PyYAML (>=3.10,<6), docker-compose (>=1.27.3,<1.28.2 || >1.28.2,<1.28.3 || >1.28.3,<1.28.4 || >1.28.4,<1.28.5 || >1.28.5,<1.28.6 || >1.28.6,<1.29.0 || >1.29.0,<1.29.1 || >1.29.1,<1.29.2 || >1.29.2,<2.0) requires PyYAML (>=3.10,<6).
  And because docker-compose (1.28.2) depends on PyYAML (>=3.10,<6)
   and docker-compose (1.28.3) depends on PyYAML (>=3.10,<6), docker-compose (>=1.27.3,<1.28.4 || >1.28.4,<1.28.5 || >1.28.5,<1.28.6 || >1.28.6,<1.29.0 || >1.29.0,<1.29.1 || >1.29.1,<1.29.2 || >1.29.2,<2.0) requires PyYAML (>=3.10,<6).
  And because docker-compose (1.28.4) depends on PyYAML (>=3.10,<6)
   and docker-compose (1.28.5) depends on PyYAML (>=3.10,<6), docker-compose (>=1.27.3,<1.28.6 || >1.28.6,<1.29.0 || >1.29.0,<1.29.1 || >1.29.1,<1.29.2 || >1.29.2,<2.0) requires PyYAML (>=3.10,<6).
  And because docker-compose (1.28.6) depends on PyYAML (>=3.10,<6)
   and docker-compose (1.29.0) depends on PyYAML (>=3.10,<6), docker-compose (>=1.27.3,<1.29.1 || >1.29.1,<1.29.2 || >1.29.2,<2.0) requires PyYAML (>=3.10,<6).
  And because docker-compose (1.29.1) depends on PyYAML (>=3.10,<6)
   and docker-compose (1.29.2) depends on PyYAML (>=3.10,<6), docker-compose (>=1.27.3,<2.0) requires PyYAML (>=3.10,<6).
  Because no versions of pytest-docker match >0.10.3,<0.11.0
   and pytest-docker (0.10.3) depends on docker-compose (>=1.27.3,<2.0), pytest-docker (>=0.10.3,<0.11.0) requires docker-compose (>=1.27.3,<2.0).
  Thus, pytest-docker (>=0.10.3,<0.11.0) requires PyYAML (>=3.10,<6).
  So, because dockerpytest depends on both PyYAML (^6.0) and pytest-docker (^0.10.3), version solving failed.

Does anyone know how to use pytest-docker with PyYAML 6.0, or how to resolve this conflict?
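One possible way out, assuming a newer pytest-docker release is acceptable for the project: per the installation notes above, the docker-compose dependency has since moved behind the optional docker-compose-v1 extra, so depending on a newer release should drop the transitive PyYAML (<6) pin. A sketch (the version bounds are assumptions; verify them against the actual releases):

```toml
# pyproject.toml -- hypothetical bounds
[tool.poetry.dev-dependencies]
PyYAML = "^6.0"
pytest-docker = ">=1.0"
```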

Support for other forms of `$DOCKER_HOST` passing

There are two cases where the docker_ip fixture returns an incorrect value:

  1. Non-default docker context used.
  2. Remote docker used via ssh://... schema instead of tcp://.

I have an idea:

import json
import os
import re
import shlex
import socket

import pytest
import pytest_docker


@pytest.fixture(scope="session")
def docker_ip() -> str:
    # These two statements support all forms of passing the dockerd endpoint
    # to docker (except the -H option):
    # 1. DOCKER_HOST variable
    # 2. DOCKER_CONTEXT variable
    # 3. docker context use
    # 4. Defaults to unix://...
    context_name = (
        (pytest_docker.plugin.execute("docker context show") or b"default")
        .decode("utf-8")
        .strip()
    )

    # This block is almost the same as in the original
    docker_host = pytest_docker.plugin.execute(
        'docker context inspect -f "{{.Endpoints.docker.Host}}" '
        + shlex.quote(context_name)
    ).decode("utf-8")
    if not docker_host or docker_host.startswith("unix://"):
        return "127.0.0.1"
    match = re.match(r"^tcp://(.+?):\d+$", docker_host)
    if match:
        return match.group(1)

    # Last-chance failover for non-tcp remote docker daemons
    info = json.loads(pytest_docker.plugin.execute("docker info -f json"))
    if info["Name"] == os.environ.get("HOSTNAME"):
        return "127.0.0.1"
    return info.get("Swarm", {}).get("NodeAddr") or socket.gethostbyname(info["Name"])

Other options are much worse:

  1. Parse ~/.ssh/config to detect chains of ProxyJump and ProxyCommand.
  2. Use something like execute("docker run --rm alpine ip route get 8.8.8.8"), which will not work in a private network.
  3. Require swarm-enabled daemons and force docker info -f {{.Swarm.NodeAddr}}.

Maybe somebody knows a better way to discover a connectable address for a remote docker daemon, but note that it can be entirely non-connectable if ssh goes through a firewall or if the daemon has --ip=127.0.0.1 in its parameters.

BTW: one can configure a proxy in front of a tcp:// daemon (like Portainer does).

As a last resort, a user with a very special config can override the docker_ip fixture in conftest.py.

Tests broken because of calling fixture functions directly

This was causing warnings for some time, and it looks like newer versions of pytest are turning these warnings into errors.

______________________________________ test_docker_services_failure ______________________________________
Fixture "docker_services" called directly. Fixtures are not meant to be called directly,
but are created automatically when test functions request them as parameters.
See https://docs.pytest.org/en/latest/fixture.html for more information about fixtures, and
https://docs.pytest.org/en/latest/deprecations.html#calling-fixtures-directly about how to update your code.

I have a way of fixing this, but I'm unsure whether I should actually fix the tests or just replace them with ones that don't rely on mocking so heavily. Please check out this PR - #34

TypeError: attrib() got an unexpected keyword argument 'convert'

attrs deprecated the convert argument in version 17.4 and has since removed it, and your setup requirements do not constrain the attrs version accordingly.

How to reproduce

$ pip install attrs
$ pip install pytest
$ pip install pytest-docker
$ pip list
Package            Version
------------------ -------
attrs              19.3.0 
importlib-metadata 1.4.0  
more-itertools     8.1.0  
packaging          20.0   
pip                19.3.1 
pkg-resources      0.0.0  
pluggy             0.13.1 
py                 1.8.1  
pyparsing          2.4.6  
pytest             5.3.2  
pytest-docker      0.6.1  
setuptools         45.0.0 
six                1.13.0 
wcwidth            0.1.8  
wheel              0.33.6 
zipp               1.0.0  

$ pytest --help
...
  File ".venv/lib/python3.6/site-packages/pytest_docker/__init__.py", line 113, in DockerComposeExecutor
    _compose_files = attr.ib(convert=str_to_list)
TypeError: attrib() got an unexpected keyword argument 'convert'

Allow user to customize `docker-compose up` command line flags

When pytest is run, it launches the Docker Compose services with docker_compose.execute("up --build -d").

A problem with this: for our application, we'd need to run docker_compose.execute("build && docker-compose up -d") to actually build it properly (because docker-compose up --build follows the depends_on order for the build, which doesn't work in our case).

I think adding a new fixture like docker_setup(), similar to docker_cleanup(), would be a great way of allowing this customization (as suggested here for a different issue).
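For illustration, here is what the user side of that could look like, assuming a docker_setup fixture analogous to docker_cleanup (the fixture name and the list-of-commands form are both assumptions, not current plugin API):

```python
import pytest

# Run an explicit compose build first, then bring services up without --build
SETUP_COMMANDS = ["build", "up -d"]


@pytest.fixture(scope="session")
def docker_setup():
    # Assumed contract: the plugin runs each returned command in order
    return SETUP_COMMANDS
```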

Generated docker compose command fails

Getting the following error when using the docker_services fixture

Exception: Command docker compose -f "/tmp/pytest-of-xxx/pytest-1/docker-compose.yml" -p "pytest155540" up --build -d returned 125: """unknown shorthand flag: 'f' in -f

Using docker 20.10.17, could this be the problem?
Thanks
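One possible cause (an assumption worth checking): the Compose V2 plugin may simply not be installed for that Docker engine, so the docker CLI cannot resolve the compose subcommand and chokes on -f. If installing the plugin is not an option, a conftest.py override of the docker_compose_command fixture can fall back to the standalone V1 binary:

```python
import pytest

# Standalone V1 binary, used here as a fallback when the "docker compose"
# plugin is unavailable
COMPOSE_V1_COMMAND = "docker-compose"


@pytest.fixture(scope="session")
def docker_compose_command() -> str:
    return COMPOSE_V1_COMMAND
```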

Project maintenance and activity

Hi,

Is this project still being actively developed or accepting PRs? It appears that there have been no merges for quite some time.

@Luminaar as the last active contributor, any chance you might be able to provide some guidance on this?

Cheers.

getting connection timeout error

Hi Team
I am using an HTTP service fixture to check that the minio service is up and running, but I keep getting a connection timeout error.

import logging
import os
import time

import pytest
import requests
from requests.exceptions import ConnectionError


@pytest.fixture(scope="session")
def minio_service(docker_ip, docker_services):

    port = docker_services.port_for("minio", 9000)
    base_url = "http://{}:{}".format(docker_ip, port)
    docker_services.wait_until_responsive(
        timeout=30.0, pause=0.1, check=lambda: is_responsive(base_url + "/minio/health/live")
    )
    time.sleep(5)
    return "{}:{}".format(docker_ip, port)


def is_responsive(url):
    try:
        response = requests.get(url)
        logging.debug(response)
        if response.status_code == 200:
            return True
    except ConnectionError:
        return False


@pytest.fixture(scope="session")
def docker_compose_file(pytestconfig):
    """Point to the docker-compose file, as the PyBuilder layout does not work
    very well with pytest-docker."""
    return os.path.join(str(pytestconfig.rootdir), "src/unittest/python", "docker-compose.yml")

The output:

_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <http.client.HTTPResponse object at 0x129881b38>

    def _read_status(self):
        line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
        if len(line) > _MAXLINE:
            raise LineTooLong("status line")
        if self.debuglevel > 0:
            print("reply:", repr(line))
        if not line:
            # Presumably, the server closed the connection before
            # sending a valid response.
>           raise RemoteDisconnected("Remote end closed connection without"
                                     " response")
E           urllib3.exceptions.ProtocolError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

../../../../anaconda3/lib/python3.7/http/client.py:265: ProtocolError

During handling of the above exception, another exception occurred:

docker_ip = '127.0.0.1'
docker_services = Services(_docker_compose=DockerComposeExecutor(_compose_files=['/Users/akshay.uppalimanage.com/Desktop/MBIs/RI-173/com...ess/src/unittest/python/docker-compose.yml'], _compose_project_name='pytest85559'), _services={'minio': {9000: 32823}})

    @pytest.fixture(scope="session")
    def minio_service(docker_ip, docker_services):
        """Ensure that Minio service is up and responsive. Return the endpoint"""
    
        # `port_for` takes a container port and returns the corresponding host port
        port = docker_services.port_for("minio", 9000)
        base_url = "http://{}:{}".format(docker_ip, port)
        docker_services.wait_until_responsive(
>           timeout=5.0, pause=0.1, check=lambda: is_responsive(base_url + "/minio/health/live")
        )

src/unittest/python/conftest.py:44: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
venv/lib/python3.7/site-packages/pytest_docker/plugin.py:99: in wait_until_responsive
    if check():
src/unittest/python/conftest.py:44: in <lambda>
    timeout=5.0, pause=0.1, check=lambda: is_responsive(base_url + "/minio/health/live")
src/unittest/python/conftest.py:21: in is_responsive
    response = requests.get(url)
.pybuilder/plugins/cpython-3.7.3.final.0/lib/python3.7/site-packages/requests/api.py:76: in get
    return request('get', url, params=params, **kwargs)
.pybuilder/plugins/cpython-3.7.3.final.0/lib/python3.7/site-packages/requests/api.py:61: in request
    return session.request(method=method, url=url, **kwargs)
.pybuilder/plugins/cpython-3.7.3.final.0/lib/python3.7/site-packages/requests/sessions.py:530: in request
    resp = self.send(prep, **send_kwargs)
.pybuilder/plugins/cpython-3.7.3.final.0/lib/python3.7/site-packages/requests/sessions.py:643: in send
    r = adapter.send(request, **kwargs)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <requests.adapters.HTTPAdapter object at 0x12a7f9da0>, request = <PreparedRequest [GET]>, stream = False
timeout = Timeout(connect=None, read=None, total=None), verify = True, cert = None, proxies = OrderedDict()

    def send(self, request, stream=False, timeout=None, verify=True, cert=None, proxies=None):
        """Sends PreparedRequest object. Returns Response object.
    
        :param request: The :class:`PreparedRequest <PreparedRequest>` being sent.
        :param stream: (optional) Whether to stream the request content.
        :param timeout: (optional) How long to wait for the server to send
            data before giving up, as a float, or a :ref:`(connect timeout,
            read timeout) <timeouts>` tuple.
        :type timeout: float or tuple or urllib3 Timeout object
        :param verify: (optional) Either a boolean, in which case it controls whether
            we verify the server's TLS certificate, or a string, in which case it
            must be a path to a CA bundle to use
        :param cert: (optional) Any user-provided SSL certificate to be trusted.
        :param proxies: (optional) The proxies dictionary to apply to the request.
        :rtype: requests.Response
        """
    
        try:
            conn = self.get_connection(request.url, proxies)
        except LocationValueError as e:
            raise InvalidURL(e, request=request)
    
        self.cert_verify(conn, request.url, verify, cert)
        url = self.request_url(request, proxies)
        self.add_headers(request, stream=stream, timeout=timeout, verify=verify, cert=cert, proxies=proxies)
    
        chunked = not (request.body is None or 'Content-Length' in request.headers)
    
        if isinstance(timeout, tuple):
            try:
                connect, read = timeout
                timeout = TimeoutSauce(connect=connect, read=read)
            except ValueError as e:
                # this may raise a string formatting error.
                err = ("Invalid timeout {}. Pass a (connect, read) "
                       "timeout tuple, or a single float to set "
                       "both timeouts to the same value".format(timeout))
                raise ValueError(err)
        elif isinstance(timeout, TimeoutSauce):
            pass
        else:
            timeout = TimeoutSauce(connect=timeout, read=timeout)
    
        try:
            if not chunked:
                resp = conn.urlopen(
                    method=request.method,
                    url=url,
                    body=request.body,
                    headers=request.headers,
                    redirect=False,
                    assert_same_host=False,
                    preload_content=False,
                    decode_content=False,
                    retries=self.max_retries,
                    timeout=timeout
                )
    
            # Send the request.
            else:
                if hasattr(conn, 'proxy_pool'):
                    conn = conn.proxy_pool
    
                low_conn = conn._get_conn(timeout=DEFAULT_POOL_TIMEOUT)
    
                try:
                    low_conn.putrequest(request.method,
                                        url,
                                        skip_accept_encoding=True)
    
                    for header, value in request.headers.items():
                        low_conn.putheader(header, value)
    
                    low_conn.endheaders()
    
                    for i in request.body:
                        low_conn.send(hex(len(i))[2:].encode('utf-8'))
                        low_conn.send(b'\r\n')
                        low_conn.send(i)
                        low_conn.send(b'\r\n')
                    low_conn.send(b'0\r\n\r\n')
    
                    # Receive the response from the server
                    try:
                        # For Python 2.7, use buffering of HTTP responses
                        r = low_conn.getresponse(buffering=True)
                    except TypeError:
                        # For compatibility with Python 3.3+
                        r = low_conn.getresponse()
    
                    resp = HTTPResponse.from_httplib(
                        r,
                        pool=conn,
                        connection=low_conn,
                        preload_content=False,
                        decode_content=False
                    )
                except:
                    # If we hit any problems here, clean up the connection.
                    # Then, reraise so that we can handle the actual exception.
                    low_conn.close()
                    raise
    
        except (ProtocolError, socket.error) as err:
>           raise ConnectionError(err, request=request)
E           requests.exceptions.ConnectionError: ('Connection aborted.', RemoteDisconnected('Remote end closed connection without response'))

.pybuilder/plugins/cpython-3.7.3.final.0/lib/python3.7/site-packages/requests/adapters.py:498: ConnectionError

Executing multiple commands for `docker_setup` and `docker_cleanup`

Would there be interest in the ability to easily run multiple commands on docker_setup and docker_cleanup, while maintaining backward compatibility? This would be trivial to implement in the plugin.

My use case: I want to run some additional commands on a container during setup, and I can imagine scenarios where running additional commands at cleanup would be helpful too. Overriding these functions in a project seems like a lot more work for end users than simply supplying all of the commands they want to execute.
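A sketch of the backward-compatible shape this could take inside the plugin (normalize_commands is a name invented here): treat a plain string as one command and any other sequence as many.

```python
from typing import List, Union


def normalize_commands(commands: Union[str, List[str]]) -> List[str]:
    """Accept either one compose command (current behaviour) or several."""
    if isinstance(commands, str):
        return [commands]
    return list(commands)
```

The plugin would then iterate over the normalized list, so existing fixtures returning a single string keep working unchanged.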

Skipping the setup in Gitlab CI

Can we skip this from running based on an env var?
I want to use this to run tests on my local machine, but I want to skip the docker-compose part when the tests are running in GitLab CI.

same set of tests, many docker-compose.yml?

Hello

I have a bunch of things to test against several disparate environments: several compose files, with several sets of environment variables. Ideally I would also test for failures to start containers, but that's a plus.

Now, my Python is a bit rusty, but I guess a docker_compose_file fixture with class scope is what I need, or a parametrized fixture? Can I do that with the current pytest-docker, or should I just accept the code duplication, or run pytest several times?

Thanks
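For what it's worth, a parametrized session-scoped fixture can drive this: pytest tears down and re-creates the docker stack once per parameter, so the whole suite runs against each compose file in turn. A sketch (the file names are hypothetical):

```python
import os

import pytest

# Hypothetical per-environment compose files at the project root
COMPOSE_FILES = ["docker-compose.env-a.yml", "docker-compose.env-b.yml"]


@pytest.fixture(scope="session", params=COMPOSE_FILES)
def docker_compose_file(request, pytestconfig):
    # Each param yields a fresh docker stack for all dependent tests
    return os.path.join(str(pytestconfig.rootdir), request.param)
```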

Support for docker-compose V2

I guess the maintainers are already aware of all the context, but I'll try to summarize it without boring anyone.

docker-compose V2 has been rewritten in Go and is no longer distributed as a Python package.

In the V1 deprecation notice there is a commitment to fix security issues, but no bug fixes or new features will ever be released: https://github.com/docker/compose/tree/master#v1-vs-v2-transition-hourglass_flowing_sand

The V1 branch has effectively been stale since V2 was first released 4 months ago.

The first signs of erosion are already showing up: #70

Fortunately, the command-line interface is still the only supported interface to docker-compose, and there is an explicit commitment to keep it stable and compatible, so by virtue of #19 we should have little to no trouble with the switch.

Effectively, after dropping the requirement on the docker-compose V1 Python package, my first attempt to run the pytest-docker test suite with my system's docker-compose 2.2.2 was a total success.

The main issue so far: in common setups using a virtualenv, the current hard dependency on docker-compose V1 will usually shadow any other version provided by the user, or at least cause confusion about what is actually being used (see #69?).

Therefore, aside from any future issue that could show up, the main challenges can be reduced to:

  • A breaking change in pytest-docker installation instructions
  • Complicating the tox.ini matrix so that tests can be run against both docker-compose V1 and V2
  • Making docker-compose V2 available in the CI workflows.

Fixture `docker_ip` not found

Hello,

I'm creating an http_service fixture like the one featured in the readme. When I use the fixture, I'm getting fixture 'docker_ip' not found.

conftest.py

import json
import os
import pytest
import requests
import pathlib
from .utils import is_responsive_404

@pytest.fixture
def tests_dir():
    return pathlib.Path(__file__).resolve().parent


@pytest.fixture(scope="session")
def docker_compose_file(pytestconfig):
    return pathlib.Path(__file__).resolve().parent / "test-docker-compose.yml"


@pytest.fixture(scope="session")
def http_service(docker_ip, docker_services):
    """
    Ensure that Django service is up and responsive.
    """

    # `port_for` takes a container port and returns the corresponding host port
    port = docker_services.port_for("django", 8000)
    url = "http://{}:{}".format(docker_ip, port)
    url404 = f"{url}/missing"
    docker_services.wait_until_responsive(
        timeout=2400.0, pause=0.1, check=lambda: is_responsive_404(url404)
    )
    return url

tests file

import json
import pytest
import requests

test_endpoint = "api/"
test_headers = {"Content-Type": "application/json", "Accept": "application/json"}
status = 200

def test_item_list_up(http_service):
    """
        Test item list is up
    """
    response = requests.get(f"{http_service}/{test_endpoint}item/", headers=test_headers)
    resp_data = response.json()
    assert response.status_code == status
    assert resp_data.get('results', None) != None
    assert resp_data.get('pagination', {}).get('totalResults', -1) != -1

When I run pytest -vv I am getting:

ERROR at setup of test_item_list_up
file .../test_item_api.py, line 24
  def test_item_list_up(http_service):
file .../tests/conftest.py, line 18
  @pytest.fixture(scope="session")
  def http_service(docker_ip, docker_services):
E       fixture 'docker_ip' not found
>       available fixtures: anyio_backend, anyio_backend_name, anyio_backend_options, cache, capfd, capfdbinary, caplog, capsys, capsysbinary, class_mocker, docker_compose_file, doctest_namespace, http_service, mocker, module_mocker, monkeypatch, package_mocker, pytestconfig, record_property, record_testsuite_property, record_xml_attribute, recwarn, session_mocker, tests_dir, tmp_path, tmp_path_factory, tmpdir, tmpdir_factory
>       use 'pytest --fixtures [testpath]' for help on them.

Current pip venv:

Package            Version
------------------ ---------
attrs              21.4.0
bcrypt             4.0.1
certifi            2022.12.7
cffi               1.15.1
charset-normalizer 2.0.12
cryptography       40.0.1
distro             1.8.0
docker             6.0.1
docker-compose     1.29.2
dockerpty          0.4.1
docopt             0.6.2
exceptiongroup     1.1.1
idna               3.4
iniconfig          2.0.0
jsonschema         3.2.0
packaging          23.0
paramiko           3.1.0
pip                21.2.3
pluggy             1.0.0
py                 1.11.0
pycparser          2.21
PyNaCl             1.5.0
pyrsistent         0.19.3
pytest             6.2.5
pytest-docker      1.0.1
python-dotenv      0.21.1
PyYAML             5.4.1
requests           2.26.0
setuptools         57.4.0
six                1.16.0
texttable          1.6.7
toml               0.10.2
tomli              2.0.1
urllib3            1.26.15
websocket-client   0.59.0

limit services started by docker_services

The docker_services fixture starts all of the containers mentioned in docker-compose.yaml.
Is there some way to limit it to a subset of the services, e.g. docker_services(excluding_services: List)?

In a development setting this would be useful, as we normally want to test the service with hot-reload functionality.

I checked fixture parametrization, but that is intended for running against all of the provided parameters one by one.
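Nothing built in that I know of, but since compose's up accepts explicit service names, a restricted command can be assembled and fed into whatever override mechanism is available (the helper below is a sketch; how the plugin consumes it is an assumption):

```python
from typing import Iterable


def up_command(services: Iterable[str]) -> str:
    """Compose 'up' limited to the given services; their dependencies still
    start automatically via depends_on."""
    return "up --build -d " + " ".join(services)
```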

Rebuild the images

docker-compose can't always detect that an image needs to be rebuilt, and sometimes the tests fail because you're testing an image built from old files. My current workaround is to call docker-compose -f tests/docker-compose.yml build before running the tests. A better solution could be to add the --build option to docker-compose up in the fixture.

Support for Pytest 8

Pytest 8.0.0 is out! Let's include it in the test matrix and see if any changes are necessary.

Getting rid of tests that only test the implementation, and the path of future updates?

Hey @AndreLouisCaron! I think I finally have the time to show this project some love.

While I have some ideas for updates (e.g. #13), first I'd like to plow through the tests for this plugin itself. I think they are tied too closely to the implementation in most places, which shows in the ubiquitous mocking of subprocess.check_output. Because of that, they will probably all break with any serious refactoring.

I think we can reach 100% coverage with something like two tests that actually run docker_services, plus a couple of tests for discrete functions. That should demonstrate that the plugin is actually working.

I plan to submit a couple of self-contained but chained (one depending upon the other) pull requests. And, of course, I'll be mindful not to break anything for current users. What do you think about this whole thing?

I'm already tackling the problems (pytest-dev/pytest#4193), but I wanted to hear your thoughts before I go too far ahead.

different behaviour compared to local docker usage

Hello,
I want to test a Postgres database which runs in a Docker container. Unfortunately, it works with a local Docker instance running, but not when I use the plugins pytest-docker and pytest-docker-compose. To present a minimal example, I have two files.

  1. The docker-compose.yml file, which contains the following:
version: '2'
services:
  postgres:
    image: "postgres"
    ports:
      - "54320:5432"
    environment:
      - POSTGRES_PASSWORD=mysecretpass
      - POSTGRES_DB=immo
    container_name: test_postgres
  2. The necessary fixtures and the test are put in one file called test_mini.py, which is in the same directory as the docker-compose file. For convenience, I use the library sqlalchemy_utils, which can simply be installed with pip. Additionally, the 'mainstream' database driver psycopg2 is needed, which can also be installed with pip. The file contains:
import os
import pytest
from sqlalchemy_utils import database_exists


@pytest.fixture(scope="session")
def docker_compose_file(pytestconfig):
    return os.path.join(".", "docker-compose.yml")


@pytest.fixture(scope="session")
def db_url(docker_ip, docker_services):
    """ default database for testing """

    port = docker_services.port_for("postgres", 5432)
    return f'postgresql://postgres:mysecretpass@{docker_ip}:{port}/immo'


def test_db(db_url):
    assert database_exists(db_url)

The resulting error is:

E       sqlalchemy.exc.OperationalError: (psycopg2.OperationalError) server closed the connection unexpectedly
E       	This probably means the server terminated abnormally
E       	before or while processing the request.
E       
E       (Background on this error at: http://sqlalche.me/e/e3q8)

As stated above, this only becomes an error if I use the pytest-docker plugin. Furthermore, if I put a debugging statement before the assert (import pdb; pdb.set_trace()) and simply continue (c), the test suddenly succeeds. I have looked through the web but couldn't find a solution. Do you have an idea? I would appreciate it very much.

Best

Utility to run arbitrary commands on a container

Most projects need some kind of initialization. So far I have been copy-pasting this kind of script across projects:

import pytest
import textwrap


@pytest.fixture(scope="session")
def backend_init(docker_services):

    def backend_run(command):
        docker_services._docker_compose.execute(
            "run --rm backend " + textwrap.dedent(command))

    backend_run("python -m django migrate")
    backend_run("python -m django configure")
    backend_run("""\
    bash <<EOF
    set -exo pipefail
    pip install pipenv
    pipenv install --dev --deploy
    python -m django load-fake-data
    EOF
    """)

    return

This sort of works, but I wonder whether it would be worth having some kind of utility in pytest-docker to address this need.

Also, using docker_services._docker_compose.execute feels odd.
