python-executor's Introduction

executor: Programmer friendly subprocess wrapper


The executor package is a simple wrapper for Python's subprocess module that makes it very easy to handle subprocesses on UNIX systems with proper escaping of arguments and error checking:

  • An object oriented interface is used to execute commands using sane but customizable (and well documented) defaults.
  • Remote commands (executed over SSH) are supported using the same object oriented interface, as are commands inside chroots (executed using schroot).
  • There's also support for executing a group of commands concurrently in what's called a "command pool". The concurrency level can be customized and of course both local and remote commands are supported.

The package is currently tested on Python 2.7, 3.5, 3.6, 3.7, 3.8 and PyPy. For usage instructions please refer to the following sections and the documentation.

The executor package is available on PyPI which means installation should be as simple as:

$ pip install executor

There's actually a multitude of ways to install Python packages (e.g. the per user site-packages directory, virtual environments or just installing system wide) and I have no intention of getting into that discussion here, so if this intimidates you then read up on your options before returning to these instructions ;-).

There are two ways to use the executor package: As the command line program executor and as a Python API. The command line interface is described below and there are also some examples of simple use cases of the Python API.

Usage: executor [OPTIONS] COMMAND ...

Easy subprocess management on the command line based on the Python package with the same name. The "executor" program runs external commands with support for timeouts, dynamic startup delay (fudge factor) and exclusive locking.

You can think of "executor" as a combination of the "flock" and "timelimit" programs with some additional niceties (namely the dynamic startup delay and integrated system logging on UNIX platforms).

Supported options:

-t, --timeout=LIMIT
  Set the time after which the given command will be aborted. By default LIMIT is counted in seconds. You can also use one of the suffixes "s" (seconds), "m" (minutes), "h" (hours) or "d" (days).

-f, --fudge-factor=LIMIT
  This option controls the dynamic startup delay (fudge factor) which is useful when you want a periodic task to run once per given interval but the exact time is not important. Refer to the --timeout option for acceptable values of LIMIT, this number specifies the maximum amount of time to sleep before running the command (the minimum is zero, otherwise you could just include the command "sleep N && ..." in your command line :-).

-e, --exclusive
  Use an interprocess lock file to guarantee that executor will never run the external command concurrently. Refer to the --lock-timeout option to customize blocking / non-blocking behavior. To customize the name of the lock file you can use the --lock-file option.

-T, --lock-timeout=LIMIT
  By default executor tries to claim the lock and if it fails it will exit with a nonzero exit code. This option can be used to enable blocking behavior. Refer to the --timeout option for acceptable values of LIMIT.

-l, --lock-file=NAME
  Customize the name of the lock file. By default this is the base name of the external command, so if you're running something generic like "bash" or "python" you might want to change this :-).

-v, --verbose
  Increase logging verbosity (can be repeated).

-q, --quiet
  Decrease logging verbosity (can be repeated).

-h, --help
  Show this message and exit.
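For example, a nightly cron job could be wrapped so that it never runs concurrently with itself, starts at a slightly randomized moment and is aborted if it takes too long (the script name below is just a placeholder):

$ executor --exclusive --fudge-factor=5m --timeout=30m nightly-backup.sh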

Below are some examples of how versatile the execute() function is. Refer to the API documentation on Read the Docs for (a lot of) other use cases.

By default the status code of the external command is returned as a boolean:

>>> from executor import execute
>>> execute('true')
True

If an external command exits with a nonzero status code an exception is raised. This makes it easy to do the right thing: you can never forget to check the status code of an external command, and you don't have to write a lot of repetitive checking code:

>>> execute('false')
Traceback (most recent call last):
  File "executor/__init__.py", line 124, in execute
    cmd.start()
  File "executor/__init__.py", line 516, in start
    self.wait()
  File "executor/__init__.py", line 541, in wait
    self.check_errors()
  File "executor/__init__.py", line 568, in check_errors
    raise ExternalCommandFailed(self)
executor.ExternalCommandFailed: External command failed with exit code 1! (command: bash -c false)

The ExternalCommandFailed exception exposes command and returncode attributes. If you know a command is likely to exit with a nonzero status code and you want execute() to simply return a boolean you can do this instead:

>>> execute('false', check=False)
False
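The exception can also be caught and inspected directly; a minimal sketch based on the command and returncode attributes mentioned above:

>>> from executor import execute, ExternalCommandFailed
>>> try:
...     execute('false')
... except ExternalCommandFailed as error:
...     print(error.returncode)
...
1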

Here's how you can provide input to an external command:

>>> execute('tr a-z A-Z', input='Hello world from Python!\n')
HELLO WORLD FROM PYTHON!
True

Getting the output of external commands is really easy as well:

>>> execute('hostname', capture=True)
'peter-macbook'

It's also very easy to execute commands with super user privileges:

>>> execute('echo test > /etc/hostname', sudo=True)
[sudo] password for peter: **********
True
>>> execute('hostname', capture=True)
'test'

If you're wondering how simply prefixing the above command with sudo can work at all (the output redirection to /etc/hostname needs root privileges as well), here's how the command is actually executed:

>>> import logging
>>> logging.basicConfig()
>>> logging.getLogger().setLevel(logging.DEBUG)
>>> execute('echo peter-macbook > /etc/hostname', sudo=True)
DEBUG:executor:Executing external command: sudo bash -c 'echo peter-macbook > /etc/hostname'

To run a command on a remote system using SSH you can use the RemoteCommand class. It works as follows:

>>> from executor.ssh.client import RemoteCommand
>>> cmd = RemoteCommand('localhost', 'echo $SSH_CONNECTION', capture=True)
>>> cmd.start()
>>> cmd.output
'127.0.0.1 57255 127.0.0.1 22'

The foreach() function wraps the RemoteCommand and CommandPool classes to make it very easy to run a remote command concurrently on a group of hosts:

>>> from executor.ssh.client import foreach
>>> from pprint import pprint
>>> hosts = ['127.0.0.1', '127.0.0.2', '127.0.0.3', '127.0.0.4']
>>> commands = foreach(hosts, 'echo $SSH_CONNECTION')
>>> pprint([cmd.output for cmd in commands])
['127.0.0.1 57278 127.0.0.1 22',
 '127.0.0.1 52385 127.0.0.2 22',
 '127.0.0.1 49228 127.0.0.3 22',
 '127.0.0.1 40628 127.0.0.4 22']
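If you need explicit control over the concurrency level, the underlying classes can be used directly. Here's a rough sketch, assuming the CommandPool interface from executor.concurrent (commands are added to the pool and then executed together):

>>> from executor import ExternalCommand
>>> from executor.concurrent import CommandPool
>>> pool = CommandPool(concurrency=2)
>>> for i in range(4):
...     pool.add(ExternalCommand('sleep 1'))
...
>>> pool.run()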

The latest version of executor is available on PyPI and GitHub. The documentation is hosted on Read the Docs and includes a changelog. For bug reports please create an issue on GitHub. If you have questions, suggestions, etc. feel free to send me an e-mail at [email protected].

This software is licensed under the MIT license.

© 2020 Peter Odding.

python-executor's People

Contributors

djmattyg007, eyjhb, hlaf, keuko, xolox


python-executor's Issues

Control over the spinner when using concurrency

It would be nice to have some level of control over the spinner that is displayed when using concurrency. For example, I would like to be able to have it re-render the spinner only once every 1-2 seconds, rather than constantly.

RemoteCommand(): 'unicode' object is not callable

Hello,

I'm attempting to modify the "Running remote commands" example to use RSA key logins. I understand these arguments are documented under RemoteCommand():

>>> cmd = RemoteCommand('100.88.x.x', 'echo $SSH_CONNECTION', capture=True, ssh_user="ubuntu", identity_file="/home/sprive/key.pem")
>>> cmd.start()
>>> cmd.start()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'unicode' object is not callable

The above happened once, and hasn't repeated, but as I'm considering this to replace my own wrappers around Paramiko, I would like to understand...

I tried a second time in a new Python session:

>>> from executor.ssh.client import RemoteCommand
>>> cmd = RemoteCommand('100.88.x.x', 'echo $SSH_CONNECTION', ssh_user="ubuntu", identity_file="/home/sprive/key.pem")
>>> cmd.start()
100.88.x.x 43130 100.88.y.y 22
Connection to 100.88.y.y closed.
>>> cmd.output()
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: 'NoneType' object is not callable
>>>

Some success, but the connection closes before I can call output().

I've repeated the above using other commands, like 'hostname' and 'whoami', with the same pattern (the connection closes after start()).

The environment is EC2, if that matters. From that source PC (as user sprive) I can do a Bash one-liner to SSH to ubuntu@target and get successful output:

$ ssh ubuntu@100.88.x.x -i /home/sprive/key.pem "whoami"
ubuntu

Oddly, if I replace the command with 'echo $SSH_CONNECTION' I get empty output. However the connection is actually made: I can show this by changing the command to 'echo $SSH_CONNECTION; touch /tmp/xyz' and then checking for /tmp/xyz. I don't really think this is relevant, but since the example uses $SSH_CONNECTION as an indication of success I am mentioning it.

Thoughts?
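For what it's worth, the README example earlier on this page accesses output as an attribute rather than calling it, and passes capture=True so the output is collected instead of being streamed to the terminal; adapted to the command above:

>>> cmd = RemoteCommand('100.88.x.x', 'whoami', capture=True, ssh_user="ubuntu", identity_file="/home/sprive/key.pem")
>>> cmd.start()
>>> cmd.output
'ubuntu'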

Binary garbage in /etc/lsb-release (on Travis CI)

While implementing a fix for #10 I found a really weird edge case that (so far) has only manifested in the Python 2.6 environment on Travis CI: The parsing of /etc/lsb-release results in the expected variable names but the values contain binary garbage including nul bytes. Here's a failing Travis CI build where the following log message can be observed (hard wrapped for readability):

Extracted 4 variables from /etc/lsb-release:
{u'DISTRIB_CODENAME': 't\x00\x00\x00r\x00\x00\x00u\x00\x00\x00s\x00\x00\x00t\x00\x00\x00y\x00\x00\x00',
 u'DISTRIB_RELEASE': '1\x00\x00\x004\x00\x00\x00.\x00\x00\x000\x00\x00\x004\x00\x00\x00',
 u'DISTRIB_ID': 'U\x00\x00\x00b\x00\x00\x00u\x00\x00\x00n\x00\x00\x00t\x00\x00\x00u\x00\x00\x00',
 u'DISTRIB_DESCRIPTION': '\x00\x00\x00U\x00\x00\x00b\x00\x00\x00u\x00\x00\x00n\x00\x00\x00t\x00\x00\x00u\x00\x00\x00 \x00\x00\x001\x00\x00\x004\x00\x00\x00.\x00\x00\x000\x00\x00\x004\x00\x00\x00.\x00\x00\x005\x00\x00\x00 \x00\x00\x00L\x00\x00\x00T\x00\x00\x00S\x00\x00\x00\x00\x00\x00'}
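For what it's worth, the pattern of the garbage (each character followed by three NUL bytes) looks like UTF-32-LE encoded text being handled as a plain byte string; a quick check in a Python 3 shell:

>>> b't\x00\x00\x00r\x00\x00\x00u\x00\x00\x00s\x00\x00\x00t\x00\x00\x00y\x00\x00\x00'.decode('utf-32-le')
'trusty'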

At first this seemed a minor annoyance in the sense that the test suite failed on this and that needed a workaround (see d3451a9). However since then I found that this also has an adverse effect on the test suite of another project of mine which uses executor.

This has left me wondering whether this weird edge case is isolated to the Python 2.6 environment on Travis CI or whether this is a symptom of a bigger problem, possibly a programming error on my side. The weird thing is that the Python 2.7, 3.4, 3.5, 3.6 and PyPy builds all work fine...

In the short term I'm worried that I may have broken the distributor_id and distribution_codename logic for more environments than just the Python 2.6 environment on Travis CI, so I'm going to add a "countermeasure" of sorts to executor.

use distro python module instead of lsb_release binary

I've tried to use apt-mirror-updater from a vanilla Ubuntu 18.04 docker image to set the fastest apt mirrors. I've encountered an error because python-executor relies on the lsb_release binary to fetch the distribution codename and id. Here's the exception traceback:

Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/apt_mirror_updater/cli.py", line 160, in main
    callback()
  File "/usr/local/lib/python2.7/dist-packages/apt_mirror_updater/__init__.py", line 425, in change_mirror
    new_mirror = self.best_mirror
  File "/usr/local/lib/python2.7/dist-packages/property_manager/__init__.py", line 781, in __get__
    value = super(custom_property, self).__get__(obj, type)
  File "/usr/local/lib/python2.7/dist-packages/apt_mirror_updater/__init__.py", line 176, in best_mirror
    if self.release_is_eol:
  File "/usr/local/lib/python2.7/dist-packages/property_manager/__init__.py", line 781, in __get__
    value = super(custom_property, self).__get__(obj, type)
  File "/usr/local/lib/python2.7/dist-packages/apt_mirror_updater/__init__.py", line 343, in release_is_eol
    if hasattr(self.backend, 'get_eol_date'):
  File "/usr/local/lib/python2.7/dist-packages/property_manager/__init__.py", line 781, in __get__
    value = super(custom_property, self).__get__(obj, type)
  File "/usr/local/lib/python2.7/dist-packages/apt_mirror_updater/__init__.py", line 164, in backend
    return sys.modules[module_path]
  File "/usr/local/lib/python2.7/dist-packages/apt_mirror_updater/__init__.py", line 164, in backend
    return sys.modules[module_path]

I've debugged the value of self.distribution_id and self.distribution_codename and both are empty.

Since there's no obvious way to enforce the existence of lsb_release from inside a Python module, I suggest using the Python distro module to get that information.
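For reference, the third-party distro package exposes the needed values directly; a minimal sketch (output shown for the Ubuntu 18.04 image mentioned above):

>>> import distro
>>> distro.id()
'ubuntu'
>>> distro.codename()
'bionic'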

Fail in official Debian docker image (contains neither /etc/lsb-release nor lsb_release installed)

Similar to issue #10, but this time the official Debian docker image debian:buster (a.k.a. debian:10, also debian:latest at the moment) contains neither /etc/lsb-release nor the lsb_release program. Here's the exception traceback:

Traceback (most recent call last):
  File "/home/.local/lib/python2.7/site-packages/apt_mirror_updater/cli.py", line 160, in main
    callback()
  File "/home/.local/lib/python2.7/site-packages/apt_mirror_updater/cli.py", line 179, in report_available_mirrors
    have_bandwidth = any(c.bandwidth for c in updater.ranked_mirrors)
  File "/home/.local/lib/python2.7/site-packages/property_manager/__init__.py", line 781, in __get__
    value = super(custom_property, self).__get__(obj, type)
  File "/home/.local/lib/python2.7/site-packages/apt_mirror_updater/__init__.py", line 277, in ranked_mirrors
    mirrors = sorted(self.available_mirrors, key=lambda c: c.sort_key, reverse=True)
  File "/home/.local/lib/python2.7/site-packages/property_manager/__init__.py", line 781, in __get__
    value = super(custom_property, self).__get__(obj, type)
  File "/home/.local/lib/python2.7/site-packages/apt_mirror_updater/__init__.py", line 125, in available_mirrors
    if self.release_is_eol:
  File "/home/.local/lib/python2.7/site-packages/property_manager/__init__.py", line 781, in __get__
    value = super(custom_property, self).__get__(obj, type)
  File "/home/.local/lib/python2.7/site-packages/apt_mirror_updater/__init__.py", line 341, in release_is_eol
    logger.debug("Checking whether %s is EOL ..", self.release)
  File "/home/.local/lib/python2.7/site-packages/property_manager/__init__.py", line 781, in __get__
    value = super(custom_property, self).__get__(obj, type)
  File "/home/.local/lib/python2.7/site-packages/apt_mirror_updater/__init__.py", line 310, in release
    return coerce_release(self.distribution_codename)
  File "/home/.local/lib/python2.7/site-packages/apt_mirror_updater/releases.py", line 114, in coerce_release
    raise ValueError(msg % value)
ValueError: The string u'' doesn't match a known Debian or Ubuntu release!

So I think relying on /etc/lsb-release or lsb_release is not robust nowadays. I suggest using the strings in /etc/apt/sources.list instead, as my fork of apt-mirror-updater does; please see the commit.

Types

Hi there, would you be open to a PR adding mypy types to this library?

Environment variable can not be set

If you need to customize environment variables on a per-call basis of execute(), you currently have to manage that outside of the call, in the caller.

Allow environment variables to be set as a named argument when calling the execute() function, as in the following example:

from executor import execute
execute(
    'somescript.sh',
    env=dict(
        CONF='/etc/someconf.ini',
    ),
)
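Until such an argument exists, one possible workaround follows from the fact that commands are run through a shell by default (see the sudo bash -c debug output in the README above): the variable can be set inside the command line itself, so it only applies to that invocation. A sketch:

from executor import execute
# The assignment is interpreted by the shell that executor starts,
# so CONF is only set for this one command.
execute('CONF=/etc/someconf.ini somescript.sh')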

runtime processing of stderr from a command

Hi,

I need to run an external command and, at the same time, process its output as it is generated.

In this setting, the external command is the producer (with async=True and capture_stderr=True), and my program is the consumer. The stderr should be consumed by a function in a thread that pulls from the shared resource until the command finishes (is_finished=True).

In my first attempt, I passed a StringIO to ExternalCommand.stderr_file. My idea was to keep everything in memory, without external files. This does not work because ExternalCommand tries to get the fileno from the file object, and StringIO does not support that, so it raises an exception. I looked at the code of CachedStream and I couldn't see an obvious reason to retrieve the fileno and file name from the file object. Wouldn't it be better to keep a local copy of the file object and write to it?

In my second attempt, I thought of using a FIFO file. However, there is another shortcoming here. The built-in function open only returns a file object once the producer opens the same FIFO for writing; until then, open blocks the consumer. This prevents me from passing the file object to ExternalCommand.stderr_file. An alternative could be os.open, which allows opening with the os.O_NONBLOCK flag, but this won't work because the returned descriptor is an int.

Do you have any idea how to work around that? I would like to avoid using regular files; I'd prefer to keep everything in memory. StringIO is my favorite option, but a FIFO would also be fine.
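One possible in-memory approach, assuming the fileno() lookup mentioned above is the only hard requirement on stderr_file, is to hand ExternalCommand the write end of an os.pipe() and drain the read end in a thread; a rough sketch (untested against executor's internals, which may also want a file name):

import os
import threading
from executor import ExternalCommand

# Create a pipe: the write end goes to the command, the read end to a consumer thread.
read_fd, write_fd = os.pipe()
writer = os.fdopen(write_fd, 'wb')  # real file object with a usable fileno()

def consume():
    # Read stderr as it is produced, until the write end is closed.
    with os.fdopen(read_fd, 'rb') as reader:
        for line in reader:
            print('stderr:', line.decode(errors='replace').rstrip())

thread = threading.Thread(target=consume)
thread.start()

# 'some-noisy-command' is a placeholder; start() runs it to completion
# while the thread above keeps draining the pipe.
cmd = ExternalCommand('some-noisy-command', stderr_file=writer)
cmd.start()

writer.close()  # signal end-of-stream to the consumer
thread.join()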

Python 3.7 reserved word: async

https://docs.python.org/3.7/reference/lexical_analysis.html?highlight=reserved%20word#keywords

I didn't see any other issues mentioning this Python 3.7 change. Something to be aware of: async will be a reserved keyword in Python 3.7.

(xolox.python-executor-cz6TGcXB) ~/g/xolox.python-executor (master) $ tox -e py37
GLOB sdist-make: /home/ntangsurat/git/xolox.python-executor/setup.py
py37 create: /home/ntangsurat/git/xolox.python-executor/.tox/py37
py37 installdeps: -rrequirements-tests.txt
py37 inst: /home/ntangsurat/git/xolox.python-executor/.tox/dist/executor-19.0.zip
py37 installed: -f ~/.pypi/,coloredlogs==9.0,coverage==4.5.1,executor==19.0,fasteners==0.14.1,humanfriendly==4.8,mock==2.0.0,monotonic==1.4,pbr==3.1.1,property-manager==2.2,py==1.5.2,pytest==3.2.5,pytest-cov==2.5.1,six==1.11.0,verboselogs==1.7,virtualenv==15.1.0
py37 runtests: PYTHONHASHSEED='3646219831'
py37 runtests: commands[0] | py.test
================================================================================ test session starts ================================================================================
platform linux -- Python 3.7.0b2, pytest-3.2.5, py-1.5.2, pluggy-0.4.0 -- /home/ntangsurat/git/xolox.python-executor/.tox/py37/bin/python3.7
cachedir: .cache
rootdir: /home/ntangsurat/git/xolox.python-executor, inifile: tox.ini
plugins: cov-2.5.1
collected 0 items / 1 errors                                                                                                                                                         

====================================================================================== ERRORS =======================================================================================
________________________________________________________________________ ERROR collecting executor/tests.py _________________________________________________________________________
.tox/py37/lib/python3.7/site-packages/_pytest/python.py:395: in _importtestmodule
    mod = self.fspath.pyimport(ensuresyspath=importmode)
.tox/py37/lib/python3.7/site-packages/py/_path/local.py:668: in pyimport
    __import__(modname)
E     File "/home/ntangsurat/git/xolox.python-executor/executor/__init__.py", line 199
E       if command.async:
E                      ^
E   SyntaxError: invalid syntax
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
============================================================================== 1 error in 0.16 seconds ==============================================================================
ERROR: InvocationError: '/home/ntangsurat/git/xolox.python-executor/.tox/py37/bin/py.test'
______________________________________________________________________________________ summary ______________________________________________________________________________________
ERROR:   py37: commands failed

compatibility with asyncio

Hi!

Is there a way to use executor in the context of async/await with Python's asyncio library? I'm aware of executor.ssh.client.foreach, but in my case I need to call a lot of remote commands via many nested functions, so foreach is not easily applicable here. It would be awesome to be able to write:

async def get_data(target):
    cmd = RemoteCommand(target, 'somecommand')
    cmd.start()
    await cmd.wait()
    return cmd.output

Thank you for this incredibly helpful library!
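Until something like the above exists, one stop-gap is to push the blocking calls onto a worker thread via asyncio's default executor; a minimal sketch (the wrapper below is not part of executor's API and 'somecommand' is a placeholder):

import asyncio
from executor.ssh.client import RemoteCommand

async def get_data(target):
    def run():
        # Blocking start() runs the remote command to completion.
        cmd = RemoteCommand(target, 'somecommand', capture=True)
        cmd.start()
        return cmd.output
    loop = asyncio.get_event_loop()
    # Run the blocking wrapper in a thread so the event loop stays responsive.
    return await loop.run_in_executor(None, run)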
