ansible-collections / cloud.common
Common files for the Cloud collections
License: GNU General Public License v3.0
Dear maintainers,
This is important for your collections!
In accordance with the Community decision, we have created the news-for-maintainers repository for announcements of changes impacting collection maintainers (see the examples). It replaces Issue 45, which will be closed soon.
To get notified of new announcements, press the Watch button in the upper right corner on the repository's home page and subscribe to Issues.
We would also like to remind you about the Bullhorn contributor newsletter, which has recently started to be released weekly. To learn what it looks like, see the past releases. Please subscribe and talk to the Community via the Bullhorn!
Join us in #ansible-social (for news reporting & chat), #ansible-community (for discussing collection & maintainer topics), and other channels on Matrix/IRC.
Help the Community and the Steering Committee make the right decisions by taking part in discussing and voting on the Community Topics that impact the whole project and the collections in particular. Your opinion there will be much appreciated!
Thank you!
TURBO_MODE in cloud.common
We are using turbo_mode together with an Ansible operator written with the help of the operator-sdk. We have many thousands of objects in our cluster, resulting in many parallel reconciliations and executions of the community.kubernetes collection.
Activating turbo_mode helps to increase performance drastically. After facing out-of-memory issues with turbo_mode (see operator-framework/operator-sdk#5246 (comment)), we already needed to make a couple of modifications:
We see a FileNotFoundError and ultimately a raise if the socket can't be created and/or does not currently exist (because it was just terminated). Increasing the sleep (https://github.com/stiller-leser/cloud.common/blob/main/plugins/module_utils/turbo/common.py#L61) and increasing the TTL (https://github.com/stiller-leser/cloud.common/blob/main/plugins/module_utils/turbo/module.py#L142) via the environment variable helps somewhat.
As we have many parallel reconciliations running, all calling the community.kubernetes collection (hence all resulting in the same socket name, and therefore in the conflicts described above), it would be great if, instead of raising here (https://github.com/stiller-leser/cloud.common/blob/main/plugins/module_utils/turbo/common.py#L60), the code could fall back to a regular module execution.
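For illustration, the requested fallback could take roughly this shape (a sketch; connect, run_on_daemon, and run_regular are made-up placeholders, not the collection's actual API):

```python
# Sketch of the fallback the report asks for: when the turbo daemon socket
# cannot be reached, degrade to a regular (non-turbo) module execution
# instead of raising. All names here are illustrative.
def run_with_fallback(connect, run_on_daemon, run_regular):
    try:
        sock = connect()  # may raise FileNotFoundError if the socket is gone
    except (FileNotFoundError, ConnectionRefusedError):
        return run_regular()  # graceful degradation to a plain execution
    try:
        return run_on_daemon(sock)
    finally:
        sock.close()
```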
Best regards,
stiller-leser
We are running sanity tests across every collection included in the Ansible community package (as part of this issue) and found that ansible-test sanity --docker
against cloud.common 2.1.1 fails with ansible-core 2.13.0rc1 in ansible 6.0.0a2.
n/a
ansible [core 2.13.0rc1]
2.1.1
ansible-test sanity --docker
Tests are either passing or ignored.
ERROR: Found 3 validate-modules issue(s) which need to be resolved:
ERROR: plugins/lookup/turbo_demo.py:0:0: invalid-documentation: DOCUMENTATION.lookup: extra keys not allowed @ data['lookup']. Got 'turbo_demo'
ERROR: plugins/lookup/turbo_demo.py:0:0: invalid-documentation: DOCUMENTATION.name: required key not provided @ data['name']. Got None
ERROR: plugins/lookup/turbo_demo.py:0:0: parameter-list-no-elements: DOCUMENTATION.options.playbook_vars: Argument defines type as list but elements is not defined for dictionary value @ data['options']['playbook_vars']. Got {'description': 'list of playbook variables to add in the output.', 'type': 'list'}
ERROR: The 1 sanity test(s) listed below (out of 43) failed. See error output above for details.
validate-modules
ERROR: Command "podman exec ansible-test-controller-uZ9MOqlH /usr/bin/env ANSIBLE_TEST_CONTENT_ROOT=/root/ansible_collections/cloud/common LC_ALL=en_US.UTF-8 /usr/bin/python3.10 /root/ansible/bin/ansible-test sanity --containers '{}' --skip-test pylint --metadata tests/output/.tmp/metadata-9kzi7jbv.json --truncate 0 --color no --host-path tests/output/.tmp/host-citpj576" returned exit status 1.
Unit tests need to be properly nested, since the ansible-test tool uses this information to determine which Python versions to use when executing certain tests.
tests
Is there a way to enable Turbo Mode on all modules? The docs state that Turbo Mode can be enabled by importing
from ansible_module.turbo.module import AnsibleTurboModule as AnsibleModule
However, I'd like to speed up all Ansible modules across a run; see ansible/ansible#72184. Is a CLI option missing from the docs?
The implementation hard-codes ~/.ansible/tmp/turbo_mode.socket.
Can we please document what happens in the following scenarios?
A. My teammate and I invoke two Ansible playbooks on the same system at the same time (will they collide?)
B. I invoke a playbook that quickly uses two completely separate "cached" SDK client connections with different privileges (the daemon shuts down after 15 seconds of inactivity, but what if I'm running tasks faster than that?).
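Purely as an illustration of scenario A, one way to avoid collisions between different users would be to derive the socket path from the UID instead of hard-coding it (hypothetical; not what the collection currently does):

```python
# Illustrative only: derive a per-user socket path so two users on the same
# system do not share one daemon. The real implementation hard-codes
# ~/.ansible/tmp/turbo_mode.socket, as noted above.
import hashlib
import os


def socket_path(prefix="turbo_mode"):
    tag = hashlib.sha256(f"{os.getuid()}-{prefix}".encode()).hexdigest()[:12]
    return os.path.expanduser(f"~/.ansible/tmp/{prefix}.{tag}.socket")
```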
Based on the community decision to use true/false
for boolean values in documentation and examples, we ask that you evaluate booleans in this collection and consider changing any that do not use true/false
(lowercase).
See documentation block format for more info (specifically, option defaults).
If you have already implemented this or decide not to, feel free to close this issue.
P.S. This is an auto-generated issue; please raise any concerns here.
cloud.common
ansible [core 2.13.11]
config file = /Users/quatrava/.ansible.cfg
configured module search path = ['/Users/quatrava/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /Users/quatrava/Dev/ops/xaasible-ops/ansible-deps-cache/python-libs/lib/python/site-packages/ansible
ansible collection location = /Users/quatrava/Dev/ops/xaasible-ops/ansible-deps-cache
executable location = /Users/quatrava/Dev/ops/xaasible-ops/ansible-deps-cache/python-libs/bin/ansible
python version = 3.11.4 (main, Jun 20 2023, 16:59:59) [Clang 14.0.3 (clang-1403.0.22.14.1)]
jinja version = 3.1.2
libyaml = True
# /Users/quatrava/Dev/ops/xaasible-ops/ansible-deps-cache/python-libs/lib/python/site-packages/ansible_collections
Collection Version
----------------------------- -------
amazon.aws 3.4.0
ansible.netcommon 3.1.0
ansible.posix 1.4.0
ansible.utils 2.6.1
ansible.windows 1.11.0
arista.eos 5.0.1
awx.awx 21.4.0
azure.azcollection 1.13.0
check_point.mgmt 2.3.0
chocolatey.chocolatey 1.3.0
cisco.aci 2.2.0
cisco.asa 3.1.0
cisco.dnac 6.5.3
cisco.intersight 1.0.19
cisco.ios 3.3.0
cisco.iosxr 3.3.0
cisco.ise 2.5.0
cisco.meraki 2.10.1
cisco.mso 2.0.0
cisco.nso 1.0.3
cisco.nxos 3.1.0
cisco.ucs 1.8.0
cloud.common 2.1.2
cloudscale_ch.cloud 2.2.2
community.aws 3.5.0
community.azure 1.1.0
community.ciscosmb 1.0.5
community.crypto 2.5.0
community.digitalocean 1.21.0
community.dns 2.3.1
community.docker 2.7.1
community.fortios 1.0.0
community.general 5.5.0
community.google 1.0.0
community.grafana 1.5.2
community.hashi_vault 3.2.0
community.hrobot 1.5.2
community.libvirt 1.2.0
community.mongodb 1.4.2
community.mysql 3.4.0
community.network 4.0.1
community.okd 2.2.0
community.postgresql 2.2.0
community.proxysql 1.4.0
community.rabbitmq 1.2.2
community.routeros 2.2.1
community.sap 1.0.0
community.sap_libs 1.2.0
community.skydive 1.0.0
community.sops 1.3.0
community.vmware 2.8.0
community.windows 1.11.0
community.zabbix 1.8.0
containers.podman 1.9.4
cyberark.conjur 1.1.0
cyberark.pas 1.0.14
dellemc.enterprise_sonic 1.1.1
dellemc.openmanage 5.5.0
dellemc.os10 1.1.1
dellemc.os6 1.0.7
dellemc.os9 1.0.4
f5networks.f5_modules 1.19.0
fortinet.fortimanager 2.1.5
fortinet.fortios 2.1.7
frr.frr 2.0.0
gluster.gluster 1.0.2
google.cloud 1.0.2
hetzner.hcloud 1.8.1
hpe.nimble 1.1.4
ibm.qradar 2.0.0
ibm.spectrum_virtualize 1.9.0
infinidat.infinibox 1.3.3
infoblox.nios_modules 1.3.0
inspur.sm 2.0.0
junipernetworks.junos 3.1.0
kubernetes.core 2.3.2
mellanox.onyx 1.0.0
netapp.aws 21.7.0
netapp.azure 21.10.0
netapp.cloudmanager 21.19.0
netapp.elementsw 21.7.0
netapp.ontap 21.22.0
netapp.storagegrid 21.10.0
netapp.um_info 21.8.0
netapp_eseries.santricity 1.3.1
netbox.netbox 3.7.1
ngine_io.cloudstack 2.2.4
ngine_io.exoscale 1.0.0
ngine_io.vultr 1.1.2
openstack.cloud 1.8.0
openvswitch.openvswitch 2.1.0
ovirt.ovirt 2.2.3
purestorage.flasharray 1.13.0
purestorage.flashblade 1.9.0
purestorage.fusion 1.0.2
sensu.sensu_go 1.13.1
servicenow.servicenow 1.0.6
splunk.es 2.0.0
t_systems_mms.icinga_director 1.31.0
theforeman.foreman 3.4.0
vmware.vmware_rest 2.2.0
vyos.vyos 3.0.1
wti.remote 1.0.4
# /Users/quatrava/Dev/ops/xaasible-ops/ansible-deps-cache/ansible_collections
Collection Version
------------------ -------
ansible.posix 1.5.4
cloud.common 2.1.4
kubernetes.core 2.4.0
vmware.vmware_rest 2.3.1
aiohttp is installed within the PYTHONPATH, but not globally. This breaks modules that rely on cloud.common.plugins.module_utils.turbo, such as vmware.vmware_rest.content_locallibrary.
The task should succeed, or complain about bad parameters, or SSL verification for my phony setup, or something.
fatal: [vsphere-test]: FAILED! => {"changed": false, "msg": "Failed to import the required Python library (aiohttp) on SCXMACQUATRAVAUXD's Python /usr/local/Cellar/[email protected]/3.11.4_1/bin/python3.11. Please read the module documentation and install it in the appropriate location. If the required library is installed, but Ansible is using the wrong Python interpreter, please consult the documentation on ansible_python_interpreter"}
I tried using the turbo_demo module in a playbook that runs on localhost using connection: local, but it gives me errors.
turbo_demo
$ ansible --version
ansible 2.10.1
config file = /etc/ansible/ansible.cfg
configured module search path = ['/Users/jgeerling/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python3.8/site-packages/ansible
executable location = /usr/local/bin/ansible
python version = 3.8.5 (default, Jul 21 2020, 10:48:26) [Clang 11.0.3 (clang-1103.0.32.62)]
ANSIBLE_NOCOWS(/etc/ansible/ansible.cfg) = True
ANSIBLE_PIPELINING(/etc/ansible/ansible.cfg) = True
ANSIBLE_SSH_CONTROL_PATH(/etc/ansible/ansible.cfg) = /tmp/ansible-ssh-%%h-%%p-%%r
DEFAULT_FORKS(/etc/ansible/ansible.cfg) = 20
DEFAULT_HOST_LIST(/etc/ansible/ansible.cfg) = ['/etc/ansible/hosts']
DEFAULT_ROLES_PATH(/etc/ansible/ansible.cfg) = ['/Users/jgeerling/Dropbox/VMs/roles']
RETRY_FILES_ENABLED(/etc/ansible/ansible.cfg) = False
macOS Catalina
---
- hosts: localhost
  connection: local
  gather_facts: false
  tasks:
    - cloud.common.turbo_demo:
      with_sequence: count=10
Run the playbook with ansible-playbook main.yml.
The demo should succeed.
For each of the 10 items in sequence, I get:
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: FileNotFoundError: [Errno 2] No such file or directory
failed: [localhost] (item=10) => {"ansible_loop_var": "item", "changed": false, "item": "10", "module_stderr": "Traceback (most recent call last):
File "/Users/jgeerling/.ansible/tmp/ansible-tmp-1602166944.607778-90470-33953846785094/AnsiballZ_turbo_demo.py", line 102, in <module>
_ansiballz_main()
File "/Users/jgeerling/.ansible/tmp/ansible-tmp-1602166944.607778-90470-33953846785094/AnsiballZ_turbo_demo.py", line 94, in _ansiballz_main
invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)
File "/Users/jgeerling/.ansible/tmp/ansible-tmp-1602166944.607778-90470-33953846785094/AnsiballZ_turbo_demo.py", line 40, in invoke_module
runpy.run_module(mod_name='ansible_collections.cloud.common.plugins.modules.turbo_demo', init_globals=None, run_name='__main__', alter_sys=True)
File "/usr/local/Cellar/[email protected]/3.8.5/Frameworks/Python.framework/Versions/3.8/lib/python3.8/runpy.py", line 207, in run_module
return _run_module_code(code, init_globals, run_name, mod_spec)
File "/usr/local/Cellar/[email protected]/3.8.5/Frameworks/Python.framework/Versions/3.8/lib/python3.8/runpy.py", line 97, in _run_module_code
_run_code(code, mod_globals, init_globals,
File "/usr/local/Cellar/[email protected]/3.8.5/Frameworks/Python.framework/Versions/3.8/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/var/folders/mt/rp7cb3s95qzfkrh1dk_8ckfc0000gn/T/ansible_cloud.common.turbo_demo_payload_cikop0vi/ansible_cloud.common.turbo_demo_payload.zip/ansible_collections/cloud/common/plugins/modules/turbo_demo.py", line 69, in <module>
File "/var/folders/mt/rp7cb3s95qzfkrh1dk_8ckfc0000gn/T/ansible_cloud.common.turbo_demo_payload_cikop0vi/ansible_cloud.common.turbo_demo_payload.zip/ansible_collections/cloud/common/plugins/modules/turbo_demo.py", line 65, in main
File "/var/folders/mt/rp7cb3s95qzfkrh1dk_8ckfc0000gn/T/ansible_cloud.common.turbo_demo_payload_cikop0vi/ansible_cloud.common.turbo_demo_payload.zip/ansible_collections/cloud/common/plugins/modules/turbo_demo.py", line 56, in run_module
File "/var/folders/mt/rp7cb3s95qzfkrh1dk_8ckfc0000gn/T/ansible_cloud.common.turbo_demo_payload_cikop0vi/ansible_cloud.common.turbo_demo_payload.zip/ansible_collections/cloud/common/plugins/module_utils/turbo/module.py", line 28, in __init__
File "/var/folders/mt/rp7cb3s95qzfkrh1dk_8ckfc0000gn/T/ansible_cloud.common.turbo_demo_payload_cikop0vi/ansible_cloud.common.turbo_demo_payload.zip/ansible_collections/cloud/common/plugins/module_utils/turbo/module.py", line 70, in run_on_daemon
File "/var/folders/mt/rp7cb3s95qzfkrh1dk_8ckfc0000gn/T/ansible_cloud.common.turbo_demo_payload_cikop0vi/ansible_cloud.common.turbo_demo_payload.zip/ansible_collections/cloud/common/plugins/module_utils/turbo/module.py", line 61, in connect
FileNotFoundError: [Errno 2] No such file or directory
", "module_stdout": "", "msg": "MODULE FAILURE
See stdout/stderr for the exact error", "rc": 1}
The latest release of cloud.common broke the behavior of parameters with aliases. #76 is the culprit. The problem is that the parameter with an alias just gets removed.
For example, ssl_ca_cert
is an alias of ca_cert
in k8s:
https://github.com/ansible-collections/kubernetes.core/blob/d6c06a20784fdaabd9e8affb902b3ebde0c482a7/plugins/module_utils/args_common.py#L37-L40
If a user sets ca_cert
in their task definition, it never gets added to the params that are passed to the turbo daemon for the module.
I would think this test case should pass, but it does not:
def test_args_aliases():
    argspec = {"foo": {"aliases": ["bar"], "type": int}}
    params = {"foo": 1}
    assert prepare_args(argspec, params) == {"ANSIBLE_MODULE_ARGS": {"foo": 1}}
Tested on both 2.11 and milestone.
The following task should fail, but it does not:
- kubernetes.core.k8s:
    kind: Namespace
    name: foobar
    ca_cert: /dev/null
    validate_certs: yes
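For illustration, an alias-aware prepare_args along these lines would make the test case above pass (a sketch under the assumption that aliases should be resolved back to their canonical option name; not the collection's actual implementation):

```python
# Illustrative alias-aware argument preparation: map each alias back to its
# canonical option name instead of dropping the parameter. prepare_args is
# a real function in the collection; this body is only a sketch.
def prepare_args(argspec, params):
    alias_to_name = {
        alias: name
        for name, spec in argspec.items()
        for alias in spec.get("aliases", [])
    }
    resolved = {alias_to_name.get(key, key): value for key, value in params.items()}
    return {"ANSIBLE_MODULE_ARGS": resolved}
```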
Relates to ansible-collections/news-for-maintainers#24.
It's been detected that this repository contains an ignore-2.14.txt file but not an ignore-2.15.txt file.
Actions needed:
Copy tests/sanity/ignore-2.14.txt to tests/sanity/ignore-2.15.txt; otherwise the CI will start failing.
Add the stable-2.15 branch to your CI workflow files. If you use GitHub Actions, see the content of .github/workflows; if you use Azure Pipelines, see community.postgresql/.azure-pipelines/azure-pipelines.yml. If not added, this will violate the Collection Requirements for collections included in the Ansible package.
To read more about the topic, see the Ignores guide.
To read more about the context, see the announcement.
Thank you!
Need the ability to override the default TTL for the daemon process, to preserve the client session for a specific period of time longer than the 15-second default, or until the playbook finishes executing.
What's the best way to do that currently?
ansible_collections.cloud.common.plugins.module_utils.turbo.server
ansible_collections.cloud.common.plugins.module_utils.turbo.module
Can an implementation like this be supported?
class AnsibleTurboModule(ansible.module_utils.basic.AnsibleModule):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        ...
        self.ttl = kwargs.get("ttl", 15)
- name: "Establish config session and send commands to candidate mode"
  my_custom_module_router_config:
    router_ip: "{{ r_ip }}"
    username: "{{ r_user }}"
    password: "{{ r_pass }}"
    operations:
      - config
    config: "{{ cli_command_set }}"

- name: "Do other stuff..."
  pause:
    minutes: 15

- name: "Must use previously created config session to commit config to running datastore"
  my_custom_module_router_config:
    router_ip: "{{ r_ip }}"
    username: "{{ r_user }}"
    password: "{{ r_pass }}"
    operations:
      - commit
    config: "{{ cli_command_set }}"
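The request above boils down to making the daemon's idle deadline configurable. A minimal sketch of such an idle timer (illustrative; the 15-second default mirrors the documented behavior, while the override knob is hypothetical):

```python
# Sketch of a configurable idle-TTL mechanism like the one the turbo daemon
# uses: reset a deadline on every incoming request and shut down once it
# expires. The ttl argument is the proposed override knob.
import time


class IdleTimer:
    def __init__(self, ttl=15):
        self.ttl = ttl
        self.touch()

    def touch(self):
        # Call this on every incoming request to push the deadline back.
        self.deadline = time.monotonic() + self.ttl

    def expired(self):
        # The daemon's main loop would exit once this returns True.
        return time.monotonic() >= self.deadline
```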
We would like to use this with k8s inventory plugin and with a custom k8s inventory plugin (kubectl connection). I'm not sure if it will work out of the box. If not, any pointer to how to make it work would be appreciated.
inventory, connection
In operator SDK and Kubernetes environment
N/A
When turbo is enabled, a module parameter with aliases is not recognized and its default value is used instead. However, its last alias is recognized. For example: the k8s_info parameter api_version.
module
ansible 2.9.23
config file = None
configured module search path = ['/var/home/job/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /var/home/job/.local/lib/python3.9/site-packages/ansible
executable location = /var/home/job/.local/bin/ansible
python version = 3.9.7 (default, Aug 30 2021, 00:00:00) [GCC 11.2.1 20210728 (Red Hat 11.2.1-1)]
Cloud.common v2.1.0
N/A
Fedora 34
ansible-playbook play.yaml
- name: test alias
  hosts: localhost
  gather_facts: false
  collections:
    - kubernetes.core
  environment:
    ENABLE_TURBO_MODE: 1
  tasks:
    - name: create obj with version different than v1 (default)
      k8s:
        state: present
        definition:
          apiVersion: batch/v1beta1
          kind: CronJob
          metadata:
            name: deleteme
            namespace: default
          spec:
            schedule: "*/1 * * * *"
            jobTemplate:
              spec:
                template:
                  spec:
                    containers:
                      - name: hello
                        image: busybox
                        imagePullPolicy: IfNotPresent
                        command:
                          - /bin/sh
                          - -c
                          - date; echo Hello from the Kubernetes cluster
                    restartPolicy: OnFailure

    - name: THIS FAIL (No alias), get obj
      retries: 1
      delay: 1
      ignore_errors: true
      until:
        - (wait_failed_task.resources | first).status is defined
      k8s_info:
        api_version: batch/v1beta1
        kind: CronJob
        namespace: default
        name: deleteme
      register: wait_failed_task

    - name: get obj
      retries: 1
      delay: 1
      until:
        - (wait_ok_task.resources | first).status is defined
      k8s_info:
        # api_version: batch/v1beta1
        api: batch/v1beta1  # alias of api_version
        kind: CronJob
        namespace: default
        name: deleteme
      register: wait_ok_task

    - name: THIS FAIL (not the last alias?), delete obj
      ignore_errors: true
      k8s:
        # api_version: batch/v1beta1
        version: batch/v1beta1  # alias of api_version
        kind: CronJob
        namespace: default
        name: deleteme
        state: absent

    - name: delete obj
      k8s:
        # api_version: batch/v1beta1
        api: batch/v1beta1  # alias of api_version
        kind: CronJob
        namespace: default
        name: deleteme
        state: absent
Expected: the API version is given the value set by api_version (or its alias).
Actual: the API version gets the default value ("v1") even though we are setting it to a different one.
[WARNING]: No inventory was parsed, only implicit localhost is available
[WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all'
PLAY [test alias] **********************************************************************************************************************
TASK [create obj with version different than v1 (default)] *****************************************************************************
changed: [localhost]
TASK [THIS FAIL (No alias), get obj] ***************************************************************************************************
FAILED - RETRYING: THIS FAIL (No alias), get obj (1 retries left).
fatal: [localhost]: FAILED! => {"api_found": false, "attempts": 1, "changed": false, "msg": "Failed to find API for resource with apiVer
sion \"v1\" and kind \"CronJob\"", "resources": []}
...ignoring
TASK [get obj] *************************************************************************************************************************
ok: [localhost]
TASK [THIS FAIL (not the last alias?), delete obj] *************************************************************************************
fatal: [localhost]: FAILED! => {"changed": false, "msg": "Failed to find exact match for v1.CronJob by [kind, name, singularName, shortN
ames]"}
...ignoring
TASK [delete obj] **********************************************************************************************************************
changed: [localhost]
PLAY RECAP *****************************************************************************************************************************
localhost : ok=5 changed=2 unreachable=0 failed=0 skipped=0 rescued=0 ignored=2
Turbo server processes are not killed after 15 seconds as mentioned in the documentation
ansible_collections.cloud.common.plugins.module_utils.turbo.server
ansible [core 2.13.11]
config file = /opt/ansible-playbooks-commit-2/ansible.cfg
configured module search path = ['/home/ansible/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
ansible python module location = /opt/ansible-playbooks-commit-2/.venv/lib/python3.8/site-packages/ansible
ansible collection location = /opt/ansible-playbooks-commit-2/collections:/home/ansible/.ansible/collections:/usr/share/ansible/collections
executable location = /opt/ansible-playbooks-commit-2/.venv/bin/ansible
python version = 3.8.13 (default, Mar 28 2022, 09:27:29) [GCC 8.3.0]
jinja version = 3.1.2
libyaml = True
$ ansible-galaxy collection list community.general
# /opt/ansible-playbooks-commit-2/collections/ansible_collections
Collection Version
----------------- -------
community.general 7.4.0
$ ansible-galaxy collection list cloud.common
# /opt/ansible-playbooks-commit-2/collections/ansible_collections
Collection Version
------------ -------
cloud.common 2.1.4
$ ansible-galaxy collection list vmware.vmware_rest
# /opt/ansible-playbooks-commit-2/collections/ansible_collections
Collection Version
------------------ -------
vmware.vmware_rest 2.3.1
AGNOSTIC_BECOME_PROMPT(/opt/ansible-playbooks-commit-2/ansible.cfg) = False
CALLBACKS_ENABLED(/opt/ansible-playbooks-commit-2/ansible.cfg) = ['timer', 'ansible.posix.profile_tasks']
COLLECTIONS_PATHS(/opt/ansible-playbooks-commit-2/ansible.cfg) = ['/opt/ansible-playbooks-commit-2/collections', '/home/ansible/.ansible/collections', '/usr/share/ansible/collections']
DEFAULT_FORKS(/opt/ansible-playbooks-commit-2/ansible.cfg) = 50
DEFAULT_GATHERING(/opt/ansible-playbooks-commit-2/ansible.cfg) = explicit
DEFAULT_TIMEOUT(/opt/ansible-playbooks-commit-2/ansible.cfg) = 15
DEFAULT_VAULT_PASSWORD_FILE(/opt/ansible-playbooks-commit-2/ansible.cfg) = /opt/ansible-playbooks-commit-2/inventories/vault_password.sh
DISPLAY_SKIPPED_HOSTS(/opt/ansible-playbooks-commit-2/ansible.cfg) = False
HOST_KEY_CHECKING(/opt/ansible-playbooks-commit-2/ansible.cfg) = False
INVENTORY_ANY_UNPARSED_IS_FAILED(/opt/ansible-playbooks-commit-2/ansible.cfg) = True
INVENTORY_ENABLED(/opt/ansible-playbooks-commit-2/ansible.cfg) = ['host_list', 'ini', 'constructed']
INVENTORY_IGNORE_EXTS(/opt/ansible-playbooks-commit-2/ansible.cfg) = ['~', '.orig', '.bak', '.cfg', '.retry', '.pyc', '.pyo', 'LICENSE', '.md', '.txt', 'secrets.yml', 'vars.yml', 'ssh_private_key']
RETRY_FILES_ENABLED(/opt/ansible-playbooks-commit-2/ansible.cfg) = True
RETRY_FILES_SAVE_PATH(/opt/ansible-playbooks-commit-2/ansible.cfg) = /tmp
TASK_TIMEOUT(/opt/ansible-playbooks-commit-2/ansible.cfg) = 180
VARIABLE_PRECEDENCE(/opt/ansible-playbooks-commit-2/ansible.cfg) = ['all_inventory', 'groups_inventory', 'all_plugins_play', 'groups_plugins_play', 'all_plugins_inventory', 'groups_plugins_inventory']
OS: debian 10
Kernel: Linux infra-ansible 4.19.0-25-amd64 #1 SMP Debian 4.19.289-2 (2023-08-08) x86_64 GNU/Linux
Using the vmware.vmware_rest.vcenter_vm_info module spawns an ansible_collections.cloud.common.plugins.module_utils.turbo.server process.
Wait more than the default TTL (15 seconds) and run ps -edf | grep ansible_collections.cloud.common.plugins.module_utils.turbo.server; you'll see that the process is never stopped.
Processes should be automatically stopped after the default 15-second timeout, as mentioned in the documentation.
Instead, processes are never killed, leading to a memory leak.
$ date
Wed 08 Nov 2023 12:32:47 PM CET
$ ps -edf | grep ansible_collections.cloud.common.plugins.module_utils.turbo.server
ansible 53861 1 0 12:02 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 53870 1 0 12:02 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 53873 1 0 12:02 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 53886 1 0 12:02 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 54028 1 0 12:02 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 54032 1 0 12:02 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 54036 1 0 12:02 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 54041 1 0 12:02 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 54046 1 0 12:02 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 78132 1 0 12:29 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 78134 1 0 12:29 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 78560 1 0 12:29 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 78570 1 0 12:29 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 78573 1 0 12:29 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 78727 1 0 12:29 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 78731 1 0 12:29 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 78735 1 0 12:29 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 78928 1 0 12:29 ? 00:00:00 /opt/ansible-playbooks-commit/.venv/bin/python -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 84151 71329 0 12:32 pts/4 00:00:00 grep ansible_collections.cloud.common.plugins.module_utils.turbo.server
ansible 100022 1 0 07:06 ? 00:00:00 /opt/ansible-playbooks-commit-2/.venv/bin/python3.8 -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 102366 1 0 07:09 ? 00:00:00 /opt/ansible-playbooks-commit-2/.venv/bin/python3.8 -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 102378 1 0 07:09 ? 00:00:00 /opt/ansible-playbooks-commit-2/.venv/bin/python3.8 -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 102387 1 0 07:09 ? 00:00:00 /opt/ansible-playbooks-commit-2/.venv/bin/python3.8 -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 102391 1 0 07:09 ? 00:00:00 /opt/ansible-playbooks-commit-2/.venv/bin/python3.8 -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 102576 1 0 07:10 ? 00:00:00 /opt/ansible-playbooks-commit-2/.venv/bin/python3.8 -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 102583 1 0 07:10 ? 00:00:00 /opt/ansible-playbooks-commit-2/.venv/bin/python3.8 -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 102587 1 0 07:10 ? 00:00:00 /opt/ansible-playbooks-commit-2/.venv/bin/python3.8 -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 102590 1 0 07:10 ? 00:00:00 /opt/ansible-playbooks-commit-2/.venv/bin/python3.8 -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
ansible 102673 1 0 07:10 ? 00:00:00 /opt/ansible-playbooks-commit-2/.venv/bin/python3.8 -m ansible_collections.cloud.common.plugins.module_utils.turbo.server --fork --socket-path /dev/shm/ansible/.ansible/tmp/turbo_mode.vmware.vmware_rest.socket
The AnsibleTurboModule class implements fail_json like this:
cloud.common/plugins/module_utils/turbo/module.py
Lines 174 to 179 in e620e72
cloud.common/plugins/module_utils/turbo/exceptions.py
Lines 1 to 5 in e620e72
The exception raised will be converted into one single message.
Assuming you are trying to fail with the following information:
{'changed': True, 'result': {...}, 'duration': 10, 'method': 'create'} and msg="Resource creation timed out"
the task output will be similar to this:
fatal: [localhost]: FAILED! => {"changed": false, "msg": "Resource creation timed out{'changed': True, 'result':{...}, 'duration': 10, 'method': 'create'}"}
instead of the expected:
fatal: [localhost]: FAILED! => {"changed": true, "msg": "Resource creation timed out", "duration": 10, "method": "create"}
The EmbeddedModuleFailure exception basically concatenates the msg with the kwargs (converted into a string) to build the exception message.
AnsibleTurboModule
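The structured alternative this report implies could look roughly like this (a sketch; the class body and helper are illustrative, not the collection's code): carry fail_json's keyword arguments on the exception and re-emit them as real result fields instead of folding them into msg.

```python
# Sketch: preserve fail_json's keyword arguments across the daemon boundary
# so they come back as structured result keys, not a concatenated string.
class EmbeddedModuleFailure(Exception):
    def __init__(self, msg, **kwargs):
        super().__init__(msg)
        self.msg = msg
        self.kwargs = kwargs


def render_failure(exc):
    # Rebuild the structured failure the way fail_json would report it.
    result = {"failed": True, "msg": exc.msg}
    result.update(exc.kwargs)  # keep duration, method, ... as real keys
    return result
```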
Turbo mode doesn't read in the entire module parameter data, leading to an exception when loading the JSON data. The problem is here:
I'd guess it will need to keep reading until the stream is done. This is particularly a problem for the kubernetes collection, since the module params include resource definitions that can potentially be lengthy. See ansible-collections/kubernetes.core#140 for a full example on reproducing.
server.py
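The "keep reading until the stream is done" fix suggested above can be sketched like this (illustrative; the real server code lives in plugins/module_utils/turbo/server.py):

```python
# Sketch: read a complete payload from a socket by looping on recv() until
# the peer closes, rather than assuming a single recv() returns everything.
# A lengthy Kubernetes resource definition easily exceeds one recv() buffer.
def read_all(sock, bufsize=4096):
    chunks = []
    while True:
        chunk = sock.recv(bufsize)
        if not chunk:  # peer closed the connection: payload is complete
            break
        chunks.append(chunk)
    return b"".join(chunks)
```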
We are happy to announce that the registration for the Ansible Contributor Summit is open!
This is a great opportunity for interested people to meet, discuss related topics, share their stories and opinions, get the latest important updates and just to hang out together.
There will be different announcements & presentations by Community, Core, Cloud, Network, and other teams.
Current contributors will be happy to share their stories and experience with newcomers.
There will be links to interactive self-paced instruqt scenarios shared during the event that help newcomers learn different aspects of development.
Online on Matrix and Youtube. Tuesday, April 12, 2022, 12:00 - 20:00 UTC.
Add the event to your calendar. Use the ical URL (for example, in Google Calendar "Add other calendars" > "Import from URL") instead of importing the .ics file so that any updates to the event will be reflected in your calendar.
Check out the Summit page:
We are looking forward to seeing you! :)
ansible-collections/vmware.vmware_rest#231 hardcodes a higher timeout. It would be much more elegant to expose a key to allow the user to set a specific timeout.