cohesity / cohesity-ansible-role

This repository provides an Ansible role and related modules for Cohesity DataPlatform.
License: Apache License 2.0
The oracle_job task can take start_time as an input for setting the job's custom start time. The Python module library/cohesity_oracle_job.py expects the start_time parameter as part of the input params from the oracle_job task, but because start_time is not specified in tasks/oracle_job.yml, the module always falls back to its default value of null. Can you please fix tasks/oracle_job.yml to pass start_time through as an argument?
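A hedged sketch of the one-line addition in tasks/oracle_job.yml. The surrounding variable names are illustrative, modeled on the pattern the role's other task files appear to use, not copied verbatim:

```yaml
# tasks/oracle_job.yml (excerpt, illustrative) -- pass start_time
# through so the module no longer falls back to its null default.
- cohesity_oracle_job:
    cluster: "{{ cohesity_server }}"
    username: "{{ cohesity_admin }}"
    password: "{{ cohesity_password }}"
    state: "{{ cohesity_protection.state }}"
    name: "{{ cohesity_protection.job_name }}"
    start_time: "{{ cohesity_protection.start_time | default(omit) }}"
```

With `default(omit)`, plays that do not set a custom start time keep the module's existing default behavior.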
Hi Cohesity,
In the examples section of the cohesity_source module documentation:
https://cohesity.github.io/cohesity-ansible-role/#/modules/cohesity_source
the parameter "server" is mentioned several times:
- cohesity_source:
    server: cohesity.lab
    username: admin
    password: password
    endpoint: mylinux.host.lab
    state: present
Thanks, Frank
The "excludeFilePaths" variable in the Ansible role is not behaving as expected. The problems are:
When a Cohesity protection job is defined via Ansible WITHOUT "excludeFilePaths" set, the protection job is successfully created, but immediate and subsequent runs of the exact same playbook fail with an error until "excludeFilePaths" is defined.
When a Cohesity protection job is defined via Ansible WITH "excludeFilePaths" set, the protection job is successfully created. If you undefine "excludeFilePaths" later, the playbook fails with an error unless at least ONE "excludeFilePaths" entry is defined per included path.
Subsequent runs of the playbook should not error. They do unless "excludeFilePaths" is defined.
An exception occurred during task execution. To see the full traceback, use -vvv. The error was: KeyError: 'excludeFilePaths'
fatal: [127.0.0.1]: FAILED! => {"changed": false, "module_stderr": "Traceback (most recent call last):\n File "/home/user/.ansible/tmp/ansible-tmp-1610495163.82-10786-279261066461986/AnsiballZ_cohesity_job.py", line 102, in \n _ansiballz_main()\n File "/home/user/.ansible/tmp/ansible-tmp-1610495163.82-10786-279261066461986/AnsiballZ_cohesity_job.py", line 94, in _ansiballz_main\n invoke_module(zipped_mod, temp_path, ANSIBALLZ_PARAMS)\n File "/home/user/.ansible/tmp/ansible-tmp-1610495163.82-10786-279261066461986/AnsiballZ_cohesity_job.py", line 40, in invoke_module\n runpy.run_module(mod_name='ansible.modules.cohesity_job', init_globals=None, run_name='main', alter_sys=True)\n File "/usr/lib64/python2.7/runpy.py", line 176, in run_module\n fname, loader, pkg_name)\n File "/usr/lib64/python2.7/runpy.py", line 82, in _run_module_code\n mod_name, mod_fname, mod_loader, pkg_name)\n File "/usr/lib64/python2.7/runpy.py", line 72, in _run_code\n exec code in run_globals\n File "/tmp/ansible_cohesity_job_payload_pnR1_H/ansible_cohesity_job_payload.zip/ansible/modules/cohesity_job.py", line 1150, in \n File "/tmp/ansible_cohesity_job_payload_pnR1_H/ansible_cohesity_job_payload.zip/ansible/modules/cohesity_job.py", line 951, in main\n File "/tmp/ansible_cohesity_job_payload_pnR1_H/ansible_cohesity_job_payload.zip/ansible/modules/cohesity_job.py", line 820, in update_job_util\nKeyError: 'excludeFilePaths'\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
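The KeyError suggests the module indexes excludeFilePaths unconditionally when reading an existing job back. A minimal sketch of the kind of guard that would avoid it; the dict shape is assumed from the traceback, not copied from cohesity_job.py:

```python
def extract_exclude_paths(file_protection_source):
    """Treat 'excludeFilePaths' as optional in a source definition.

    dict.get with a default never raises KeyError, so a job created
    without exclusions reads back as an empty exclusion list.
    """
    return file_protection_source.get("excludeFilePaths", [])
```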
I see that RHEL, Ubuntu, and Windows are currently supported.
Do you plan to support Solaris 11?
https://github.com/cohesity/cohesity-ansible-role/blob/master/docs/README.md
Hi Cohesity!
I'm just beginning to test your Ansible role and running into problems during the agent installation:
...
2019-01-18 09:45:44,848 p=20254 u=kortstie | changed: [xxx]
2019-01-18 09:45:44,918 p=20254 u=kortstie | TASK [Uninstall Cohesity Agent from each Linux Server] ****************************************************************************************************************************************
2019-01-18 09:45:45,185 p=20254 u=kortstie | TASK [cohesity.ansible : Install Prerequisite Packages for CentOS] **************************************************************************************************************************************************************************
2019-01-18 09:45:45,301 p=20254 u=kortstie | skipping: [xxx]
2019-01-18 09:45:45,368 p=20254 u=kortstie | TASK [cohesity.ansible : Install Prerequisite Packages for Ubuntu] **************************************************************************************************************************************************************************
2019-01-18 09:45:45,451 p=20254 u=kortstie | skipping: [xxx]
2019-01-18 09:45:45,531 p=20254 u=kortstie | TASK [cohesity.ansible : Cohesity agent: Set Agent to state of absent] **********************************************************************************************************************************************************************
2019-01-18 09:45:52,761 p=20254 u=kortstie | changed: [xxx]
2019-01-18 09:45:52,829 p=20254 u=kortstie | TASK [Install new Cohesity Agent on each Linux Server] **************************************************************************************************************************************************************************************
2019-01-18 09:45:53,035 p=20254 u=kortstie | TASK [cohesity.ansible : Install Prerequisite Packages for CentOS] **************************************************************************************************************************************************************************
2019-01-18 09:45:53,122 p=20254 u=kortstie | skipping: [xxx]
2019-01-18 09:45:53,186 p=20254 u=kortstie | TASK [cohesity.ansible : Install Prerequisite Packages for Ubuntu] **************************************************************************************************************************************************************************
2019-01-18 09:45:53,266 p=20254 u=kortstie | skipping: [xxx]
2019-01-18 09:45:53,335 p=20254 u=kortstie | TASK [cohesity.ansible : Cohesity agent: Set Agent to state of present] *********************************************************************************************************************************************************************
2019-01-18 09:45:53,810 p=20254 u=kortstie | fatal: [xxx]: FAILED! => {"Failed": true, "changed": false, "check_agent": {"stderr": "kill: sending signal to 21526 failed: No such process\n", "stdout": ""}, "msg": "Failed to remove an orphaned Cohesity Agent service which is still running", "process_id": "21526", "state": "present", "version": false}
2019-01-18 09:45:53,858 p=20254 u=kortstie | to retry, use: --limit @/opt/ansible/roles/cohesity/cohesity.retry
I checked the system, but no Cohesity processes were running and there was no PID 21526...
What is going wrong here?
Best regards from Germany
Frank
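For what it's worth, the probe behind "kill: sending signal to 21526 failed: No such process" can be reproduced by hand, which is a quick way to confirm whether the PID the module recorded is stale. A sketch, not taken from the role's code:

```shell
# kill -0 probes a PID without sending a real signal: it succeeds if
# the process exists and fails with "No such process" otherwise.
pid_running() {
  kill -0 "$1" 2>/dev/null
}

pid_running "$$" && echo "current shell is running"
pid_running 99999999 || echo "stale pid"
```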
Hi Cohesity!
An installed and running Cohesity Agent (version 6.0.0.a) is not upgraded when the cohesity_agent role runs on the host.
It would be helpful if the role detected and upgraded an older installed version of the agent.
Best Regards
Frank
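A sketch of the comparison such upgrade detection could use. The "6.0.0.a" format is taken from the report above; how the real agent reports its version is an assumption:

```python
def version_key(version):
    """Turn a dotted version such as '6.0.0.a' into a sortable key.

    Numeric chunks compare as integers; non-numeric chunks (like the
    trailing 'a') sort after any number occupying the same slot.
    """
    return [(0, int(c)) if c.isdigit() else (1, c) for c in version.split(".")]

def agent_needs_upgrade(installed, available):
    """True when the cluster offers a newer agent than is installed."""
    return version_key(available) > version_key(installed)
```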
Hey Folks,
A couple of questions:
Appreciate your help.
Thanks
ec
If it's an urgent request, please contact us instead.
Hello,
Do you know when Nutanix Acropolis will be supported in this module?
My infrastructure will be moving to full Nutanix, and this module would be really good for me.
Regards,
Fabien.
In my scenario I have a protection job named TEST_JOB_JT that has 5 VMs protected. I would like to use this Ansible module to add a 6th server named "my-new-server" to the job, so I tried this:
- name: Add server to protection job
  cohesity_job:
    cluster: "{{ cohesity_cluster }}"
    username: "{{ cohesity_api_user }}"
    password: "{{ cohesity_api_pass }}"
    validate_certs: false
    state: present
    name: TEST_JOB_JT
    environment: VMware
    include:
      - my-new-server
  register: cohesity_protection
This unfortunately changes the job to contain only "my-new-server"; the original 5 servers are removed. Is this possible, or do I need to list all servers in the job under the "include:" section? If so, how do you get the server names in a job? The cohesity_facts module lists them per job as "sourceIds" with numeric IDs instead of VM names.
Thanks,
Josh
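Until the module supports appending, one workaround is to merge the job's current VM names with the additions before calling cohesity_job, since the include list replaces whatever is on the job. A sketch; resolving the numeric sourceIds from cohesity_facts back to VM names is still left to the caller:

```python
def merged_includes(existing_names, new_names):
    """Combine the job's current VM names with the ones to add,
    preserving order and dropping duplicates."""
    merged = list(existing_names)
    for name in new_names:
        if name not in merged:
            merged.append(name)
    return merged
```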
With existing vCenter sources registered, I would like to be able to update the vCenter username and/or password to new, current credentials via a Cohesity Ansible role or task.
TASK [cohesity.cohesity_ansible_role : Cohesity agent: Set Agent to state of present] *****************************************************************************************************
task path: /home/adminlocal/cohesity-agent.playbook/.roles/cohesity.cohesity_ansible_role/tasks/agent.yml:54
<10.54.2.11> ESTABLISH SSH CONNECTION FOR USER: matthew.williams
<10.54.2.11> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="matthew.williams"' -o ConnectTimeout=10 -o StrictHostKeyChecking=no -o ControlPath=/home/adminlocal/.ansible/cp/9d1db22039 10.54.2.11 '/bin/sh -c '"'"'echo ~matthew.williams && sleep 0'"'"''
<10.54.2.11> (0, b'/home/matthew.williams\n', b'')
<10.54.2.11> ESTABLISH SSH CONNECTION FOR USER: matthew.williams
<10.54.2.11> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="matthew.williams"' -o ConnectTimeout=10 -o StrictHostKeyChecking=no -o ControlPath=/home/adminlocal/.ansible/cp/9d1db22039 10.54.2.11 '/bin/sh -c '"'"'( umask 77 && mkdir -p "` echo /home/matthew.williams/.ansible/tmp/ansible-tmp-1584050465.9643524-67940419594162 `" && echo ansible-tmp-1584050465.9643524-67940419594162="` echo /home/matthew.williams/.ansible/tmp/ansible-tmp-1584050465.9643524-67940419594162 `" ) && sleep 0'"'"''
<10.54.2.11> (0, b'ansible-tmp-1584050465.9643524-67940419594162=/home/matthew.williams/.ansible/tmp/ansible-tmp-1584050465.9643524-67940419594162\n', b'')
Using module file /home/adminlocal/cohesity-agent.playbook/.roles/cohesity.cohesity_ansible_role/library/cohesity_agent.py
<10.54.2.11> PUT /home/adminlocal/.ansible/tmp/ansible-local-44424bnwzbh71/tmp1aqn1yee TO /home/matthew.williams/.ansible/tmp/ansible-tmp-1584050465.9643524-67940419594162/AnsiballZ_cohesity_agent.py
<10.54.2.11> SSH: EXEC sftp -b - -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="matthew.williams"' -o ConnectTimeout=10 -o StrictHostKeyChecking=no -o ControlPath=/home/adminlocal/.ansible/cp/9d1db22039 '[10.54.2.11]'
<10.54.2.11> (0, b'sftp> put /home/adminlocal/.ansible/tmp/ansible-local-44424bnwzbh71/tmp1aqn1yee /home/matthew.williams/.ansible/tmp/ansible-tmp-1584050465.9643524-67940419594162/AnsiballZ_cohesity_agent.py\n', b'')
<10.54.2.11> ESTABLISH SSH CONNECTION FOR USER: matthew.williams
<10.54.2.11> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="matthew.williams"' -o ConnectTimeout=10 -o StrictHostKeyChecking=no -o ControlPath=/home/adminlocal/.ansible/cp/9d1db22039 10.54.2.11 '/bin/sh -c '"'"'chmod u+x /home/matthew.williams/.ansible/tmp/ansible-tmp-1584050465.9643524-67940419594162/ /home/matthew.williams/.ansible/tmp/ansible-tmp-1584050465.9643524-67940419594162/AnsiballZ_cohesity_agent.py && sleep 0'"'"''
<10.54.2.11> (0, b'', b'')
<10.54.2.11> ESTABLISH SSH CONNECTION FOR USER: matthew.williams
<10.54.2.11> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="matthew.williams"' -o ConnectTimeout=10 -o StrictHostKeyChecking=no -o ControlPath=/home/adminlocal/.ansible/cp/9d1db22039 -tt 10.54.2.11 '/bin/sh -c '"'"'sudo -H -S -n -u root /bin/sh -c '"'"'"'"'"'"'"'"'echo BECOME-SUCCESS-pqtrkixaiyulmlwoiotousljiadztaou ; /usr/bin/python3 /home/matthew.williams/.ansible/tmp/ansible-tmp-1584050465.9643524-67940419594162/AnsiballZ_cohesity_agent.py'"'"'"'"'"'"'"'"' && sleep 0'"'"''
Escalation succeeded
<10.54.2.11> (1, b'\r\n{"msg": "Unexpected error caused while managing the Cohesity Module.", "error_details": "\'HTTPMessage\' object has no attribute \'dict\'", "error_class": "AttributeError", "failed": true, "exception": " File \\"/tmp/ansible_cohesity_agent_payload_vl_pidi7/__main__.py\\", line 262, in download_agent\\n resp_headers = agent.info().dict\\n", "invocation": {"module_args": {"cluster": "dk-bl-cohesity.secmet.co", "username": "cohesitybackups", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "validate_certs": false, "state": "present", "service_user": "cohesityagent", "service_group": "cohesityagent", "create_user": true, "download_location": "", "native_package": false, "download_uri": "", "operating_system": "Ubuntu", "file_based": false}}}\r\n', b'Shared connection to 10.54.2.11 closed.\r\n')
<10.54.2.11> Failed to connect to the host via ssh: Shared connection to 10.54.2.11 closed.
<10.54.2.11> ESTABLISH SSH CONNECTION FOR USER: matthew.williams
<10.54.2.11> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o 'User="matthew.williams"' -o ConnectTimeout=10 -o StrictHostKeyChecking=no -o ControlPath=/home/adminlocal/.ansible/cp/9d1db22039 10.54.2.11 '/bin/sh -c '"'"'rm -f -r /home/matthew.williams/.ansible/tmp/ansible-tmp-1584050465.9643524-67940419594162/ > /dev/null 2>&1 && sleep 0'"'"''
<10.54.2.11> (0, b'', b'')
The full traceback is:
File "/tmp/ansible_cohesity_agent_payload_vl_pidi7/__main__.py", line 262, in download_agent
resp_headers = agent.info().dict
fatal: [hq-it-phv-tools-1]: FAILED! => {
"changed": false,
"error_class": "AttributeError",
"error_details": "'HTTPMessage' object has no attribute 'dict'",
"invocation": {
"module_args": {
"cluster": "dk-bl-cohesity.secmet.co",
"create_user": true,
"download_location": "",
"download_uri": "",
"file_based": false,
"native_package": false,
"operating_system": "Ubuntu",
"password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
"service_group": "cohesityagent",
"service_user": "cohesityagent",
"state": "present",
"username": "cohesitybackups",
"validate_certs": false
}
},
"msg": "Unexpected error caused while managing the Cohesity Module."
}
Just reporting that I have not found a resolution or a specific error in my implementation. This is running the playbook from the installation instructions found here: https://cohesity.github.io/cohesity-ansible-role/#/how-to-use
I have tested all versions of the role, both numbered releases and master. Not sure what I am missing.
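For context, "'HTTPMessage' object has no attribute 'dict'" is the classic symptom of Python 2-era urllib code running under Python 3: the old mimetools.Message exposed headers via a .dict attribute, while Python 3's http.client.HTTPMessage does not. A hedged sketch of a version-agnostic accessor, not the module's actual code:

```python
from http.client import HTTPMessage

def headers_as_dict(msg):
    """Return response headers as a plain dict on either Python line.

    Python 2 header objects carried a .dict attribute; Python 3's
    HTTPMessage is an email.message.Message and only offers items().
    """
    if hasattr(msg, "dict"):
        return msg.dict
    return dict(msg.items())
```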
I am attempting to create new backup jobs using the cohesity_job module and getting the error "Unexpected error caused while managing the Cohesity Module.", with an error-details message telling me to specify one or more sources to back up, even though endpoints are included in my playbook. Running with the -vvv switch, I see that the endpoint is somehow being set to null when the module is invoked, even though it is a simple string in my playbook. I did try adding additional items to the list, and they do appear, but all of the values for the endpoint key are switched to null.
Errors:
fatal: [localhost]: FAILED! => {
"changed": false,
"error_class": "bytes",
"error_details": "b'{\"errorCode\":\"KValidationError\",\"message\":\"Please specify one or more sources to backup.\"}\\n'",
"invocation": {
"module_args": {
"cancel_active": false,
"cluster": "10.181.224.10",
"delete_backups": false,
"description": null,
"environment": "VMware",
"name": "test",
"ondemand_run_type": "Regular",
"password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
"protection_policy": "Gold",
"protection_sources": [
{
"endpoint": null
}
],
"start_time": null,
"state": "present",
"storage_domain": "DefaultStorageDomain",
"time_zone": "America/Los_Angeles",
"username": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
"validate_certs": false
}
},
"msg": "Unexpected error caused while managing the Cohesity Module."
}
Playbook:
---
- name: Create backup jobs
  hosts: localhost
  gather_facts: no
  become: false
  tasks:
    - name: create new protection job
      cohesity_job:
        cluster: _cluster-vip_
        username: _username_
        password: _password_
        state: present
        name: test
        environment: VMware
        protection_sources:
          - endpoint: vcsa01.skunkworkx.cloud
        protection_policy: Gold
        storage_domain: DefaultStorageDomain
When I call the cohesity_restore_vm module and leave out the job_name parameter, the module exits with the following error:
...
/tmp/ansible-tmp-1549545897.5-200667828612790/AnsiballZ_cohesity_restore_vm.py", line 48, in invoke_module\n imp.load_module('main', mod
, module, MOD_DESC)\n File "/tmp/ansible_cohesity_restore_vm_payload_61RW3z/main.py", line 627, in \n File "/tmp/ansible_cohesity_restore_vm_payload_61RW3z/main.py", line 478, in main\nTypeError: unsupported operand
type(s) for +: 'NoneType' and 'str'\n", "module_stdout": "", "msg": "MODULE FAILURE\nSee stdout/stderr for the exact error", "rc": 1}
The module should select the most recent snapshot from any existing protection job if the job_name parameter is not given.
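The TypeError ('NoneType' + 'str') points at an unguarded string concatenation with the defaulted job_name. A sketch of the guard plus the fallback behavior described above; the helper name and query shape are hypothetical, not taken from cohesity_restore_vm.py:

```python
def build_vm_search_query(environment, job_name=None):
    """Build the snapshot-search query string, tolerating a missing
    job_name by searching across all protection jobs."""
    query = "environments=" + environment
    if job_name:  # guard: None + str would raise TypeError
        query += "&jobName=" + job_name
    return query
```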
The email address publicly posted under your feedback section seems to be wrong.
I get a mail delivery error from Googlemail after trying to send you an email...
BR
Frank
I suggest adding a parameter "vm_names" to the cohesity_job module to limit the virtual machines managed by a protection job.
Example:
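Purely as an illustration of the suggestion (this parameter does not exist today; its name and shape are hypothetical):

```yaml
- cohesity_job:
    cluster: "{{ cohesity_cluster }}"
    username: "{{ cohesity_api_user }}"
    password: "{{ cohesity_api_pass }}"
    state: present
    name: TEST_JOB
    environment: VMware
    include:
      - vcsa01.lab
    # Proposed (hypothetical): protect only these VMs from the source.
    vm_names:
      - web01
      - web02
```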
We have many webservers which can't reach the Cohesity API directly, so I can't install the agent with the Ansible role.
In the past I wrote my own playbook which copied the agent installer to these systems and then installed/upgraded it locally on the webserver...
A function in your role to manage such isolated hosts would be really helpful!
Hello,
I'd like to automate creating VM and then creating a protection job for the new VM.
However, a VM newly registered in vCenter is not visible unless the source is refreshed manually.
So can I add a function to the cohesity_source or cohesity_job module?
Which one is the better fit?
def refresh_source(module, self):
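To sketch the idea: the v1 public API appears to expose a refresh endpoint per registered source, so refresh_source would mostly be a POST to a URL like the one below. The exact path is an assumption; verify it against your cluster's REST documentation:

```python
def refresh_source_url(cluster, source_id):
    """Build the (assumed) endpoint that asks the cluster to re-scan a
    registered protection source so newly created VMs become visible."""
    return "https://{0}/irisservices/api/v1/public/protectionSources/refresh/{1}".format(
        cluster, source_id)
```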
I get an error when I try to create a protection policy with archival.
Here is the error:
fatal: [localhost]: FAILED! => {"changed": false, "error_class": "TypeError", "error_details": "must be str, not NoneType", "msg": "Unexpected error caused while managing the Cohesity Module."}
Here is my create_policy.yml:
Can anyone help me find the problem?