ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-qOV
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.12 (main, Jan 8 2026, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-14)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_default.yml ****************************************************
1 plays in /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/tests/cvm_deploy/tests_default.yml

PLAY [Ensure that the role runs with default parameters] ***********************

TASK [fedora.linux_system_roles.cvm_deploy : Set platform/version specific variables] ***
task path: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/main.yml:3
Friday 23 January 2026  20:24:21 -0500 (0:00:00.020)       0:00:00.020 ********
included: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/set_vars.yml for managed-node1

TASK [fedora.linux_system_roles.cvm_deploy : Ensure ansible_facts used by role] ***
task path: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/set_vars.yml:2
Friday 23 January 2026  20:24:21 -0500 (0:00:00.016)       0:00:00.037 ********
[WARNING]: Platform linux on host managed-node1 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node1]
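The interpreter-discovery warning above is benign for this run, but it can be silenced by pinning the interpreter instead of relying on discovery, either per host in inventory or globally via interpreter_python in ansible.cfg. A minimal sketch, assuming an INI-style inventory (the file name and group name are illustrative, not taken from this CI setup):

  # inventory.ini (hypothetical) -- pin the interpreter so a later Python
  # install on the node cannot change which interpreter Ansible uses
  [managed_nodes]
  managed-node1 ansible_python_interpreter=/usr/bin/python3.9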
TASK [fedora.linux_system_roles.cvm_deploy : Check if system is ostree] ********
task path: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/set_vars.yml:10
Friday 23 January 2026  20:24:23 -0500 (0:00:01.965)       0:00:02.002 ********
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.cvm_deploy : Set flag to indicate system is ostree] ***
task path: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/set_vars.yml:15
Friday 23 January 2026  20:24:24 -0500 (0:00:00.456)       0:00:02.459 ********
ok: [managed-node1] => {
    "ansible_facts": {
        "__cvm_deploy_is_ostree": false
    },
    "changed": false
}

TASK [fedora.linux_system_roles.cvm_deploy : Set platform/version specific variables] ***
task path: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/set_vars.yml:19
Friday 23 January 2026  20:24:24 -0500 (0:00:00.020)       0:00:02.479 ********
skipping: [managed-node1] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node1] => (item=CentOS.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "CentOS.yml",
    "skip_reason": "Conditional result was False"
}
ok: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "__cvm_deploy_packages": [],
        "__cvm_deploy_services": []
    },
    "ansible_included_var_files": [
        "/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}
ok: [managed-node1] => (item=CentOS_9.yml) => {
    "ansible_facts": {
        "__cvm_deploy_packages": [],
        "__cvm_deploy_services": []
    },
    "ansible_included_var_files": [
        "/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/vars/CentOS_9.yml"
    ],
    "ansible_loop_var": "item",
    "changed": false,
    "item": "CentOS_9.yml"
}

TASK [fedora.linux_system_roles.cvm_deploy : Ensure required packages are installed] ***
task path: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/main.yml:7
Friday 23 January 2026  20:24:24 -0500 (0:00:00.039)       0:00:02.518 ********
ok: [managed-node1] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
lsrpackages:

TASK [fedora.linux_system_roles.cvm_deploy : Ensure required services are enabled and started] ***
task path: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/main.yml:13
Friday 23 January 2026  20:24:36 -0500 (0:00:12.576)       0:00:15.095 ********
skipping: [managed-node1] => {
    "changed": false,
    "skipped_reason": "No items in the list"
}
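The set_vars tasks above follow the usual linux-system-roles pattern: stat /run/ostree-booted, record the result as a boolean fact, then include whichever platform/version vars files exist. CentOS_9.yml is included twice most likely because, on CentOS Stream 9, the major-version and full-version file names both resolve to the same file. A sketch of what set_vars.yml plausibly contains, reconstructed from the output (task names, fact names, and the "__vars_file is file" condition match the log; the register variable and loop construction are assumptions):

  - name: Check if system is ostree
    stat:
      path: /run/ostree-booted
    register: __ostree_booted_stat    # "exists": false on this node

  - name: Set flag to indicate system is ostree
    set_fact:
      __cvm_deploy_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"

  - name: Set platform/version specific variables
    include_vars: "{{ __vars_file }}"
    loop:
      - "{{ ansible_facts['os_family'] }}.yml"          # RedHat.yml: skipped, no such file
      - "{{ ansible_facts['distribution'] }}.yml"       # CentOS.yml: skipped, no such file
      - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"  # CentOS_9.yml
      - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"        # CentOS_9.yml again on Stream 9
    vars:
      __vars_file: "{{ role_path }}/vars/{{ item }}"
    when: __vars_file is file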
TASK [fedora.linux_system_roles.cvm_deploy : Generate /etc/foo.conf] ***********
task path: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/main.yml:20
Friday 23 January 2026  20:24:36 -0500 (0:00:00.013)       0:00:15.108 ********
Notification for handler Handler for cvm_deploy to restart services has been saved.
changed: [managed-node1] => {
    "changed": true,
    "checksum": "52a44b94ed4361ce3726994d2dedbe99528ed1df",
    "dest": "/etc/foo.conf",
    "gid": 0,
    "group": "root",
    "md5sum": "e122058d2eda8c0d14f66f6e510a4390",
    "mode": "0400",
    "owner": "root",
    "secontext": "system_u:object_r:etc_t:s0",
    "size": 75,
    "src": "/root/.ansible/tmp/ansible-tmp-1769217877.018957-7775-270190277607266/.source.conf",
    "state": "file",
    "uid": 0
}

TASK [Check header for ansible_managed, fingerprint] ***************************
task path: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/tests/cvm_deploy/tests_default.yml:9
Friday 23 January 2026  20:24:37 -0500 (0:00:00.796)       0:00:15.905 ********
included: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/tests/cvm_deploy/tasks/check_header.yml for managed-node1

TASK [Get file] ****************************************************************
task path: /tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/tests/cvm_deploy/tasks/check_header.yml:3
Friday 23 January 2026  20:24:37 -0500 (0:00:00.012)       0:00:15.917 ********
fatal: [managed-node1]: FAILED! => {
    "changed": false
}

MSG:

file not found: /etc/cvm_deploy.conf

PLAY RECAP *********************************************************************
managed-node1              : ok=8    changed=1    unreachable=0    failed=1    skipped=1    rescued=0    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
  {
    "ansible_version": "2.17.14",
    "end_time": "2026-01-24T01:24:38.240687+00:00Z",
    "host": "managed-node1",
    "message": "file not found: /etc/cvm_deploy.conf",
    "start_time": "2026-01-24T01:24:37.786005+00:00Z",
    "task_name": "Get file",
    "task_path": "/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/tests/cvm_deploy/tasks/check_header.yml:3"
  }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Friday 23 January 2026  20:24:38 -0500 (0:00:00.456)       0:00:16.374 ********
===============================================================================
fedora.linux_system_roles.cvm_deploy : Ensure required packages are installed -- 12.58s
/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/main.yml:7
fedora.linux_system_roles.cvm_deploy : Ensure ansible_facts used by role --- 1.97s
/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/set_vars.yml:2
fedora.linux_system_roles.cvm_deploy : Generate /etc/foo.conf ----------- 0.80s
/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/main.yml:20
fedora.linux_system_roles.cvm_deploy : Check if system is ostree -------- 0.46s
/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/set_vars.yml:10
Get file ---------------------------------------------------------------- 0.46s
/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/tests/cvm_deploy/tasks/check_header.yml:3
fedora.linux_system_roles.cvm_deploy : Set platform/version specific variables --- 0.04s
/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/set_vars.yml:19
fedora.linux_system_roles.cvm_deploy : Set flag to indicate system is ostree --- 0.02s
/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/set_vars.yml:15
fedora.linux_system_roles.cvm_deploy : Set platform/version specific variables --- 0.02s
/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/main.yml:3
fedora.linux_system_roles.cvm_deploy : Ensure required services are enabled and started --- 0.01s
/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/roles/cvm_deploy/tasks/main.yml:13
Check header for ansible_managed, fingerprint --------------------------- 0.01s
/tmp/collections-qOV/ansible_collections/fedora/linux_system_roles/tests/cvm_deploy/tests_default.yml:9
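The fatal error is a path mismatch, not a template failure: the role wrote /etc/foo.conf (see the changed copy result above), while check_header.yml slurps /etc/cvm_deploy.conf, which is never created. One side has to change; a minimal sketch, assuming the test is meant to inspect the file the role actually generates (the register name is illustrative):

  # tests/cvm_deploy/tasks/check_header.yml -- hypothetical fix
  - name: Get file
    slurp:
      path: /etc/foo.conf    # was /etc/cvm_deploy.conf
    register: __config_file  # content is base64; b64decode before checking the header

Alternatively, the role's template dest could be changed to /etc/cvm_deploy.conf; the log alone does not say which name is intended. The journal dump from the managed node follows.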
Jan 23 20:24:21 managed-node1 sshd-session[7184]: Accepted publickey for root from 10.31.43.8 port 42964 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 23 20:24:21 managed-node1 systemd-logind[610]: New session 5 of user root.
░░ Subject: A new session 5 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 5 has been created for the user root.
░░
░░ The leading process of the session is 7184.
Jan 23 20:24:21 managed-node1 systemd[1]: Started Session 5 of User root.
░░ Subject: A start job for unit session-5.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-5.scope has finished successfully.
░░
░░ The job identifier is 902.
Jan 23 20:24:21 managed-node1 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit systemd-hostnamed.service has successfully entered the 'dead' state.
Jan 23 20:24:21 managed-node1 sshd-session[7184]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Jan 23 20:24:21 managed-node1 sshd-session[7189]: Received disconnect from 10.31.43.8 port 42964:11: disconnected by user
Jan 23 20:24:21 managed-node1 sshd-session[7189]: Disconnected from user root 10.31.43.8 port 42964
Jan 23 20:24:21 managed-node1 sshd-session[7184]: pam_unix(sshd:session): session closed for user root
Jan 23 20:24:21 managed-node1 systemd[1]: session-5.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-5.scope has successfully entered the 'dead' state.
Jan 23 20:24:21 managed-node1 systemd-logind[610]: Session 5 logged out. Waiting for processes to exit.
Jan 23 20:24:21 managed-node1 systemd-logind[610]: Removed session 5.
░░ Subject: Session 5 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 5 has been terminated.
Jan 23 20:24:21 managed-node1 sshd-session[7213]: Accepted publickey for root from 10.31.43.8 port 42968 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 23 20:24:21 managed-node1 systemd-logind[610]: New session 6 of user root.
░░ Subject: A new session 6 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 6 has been created for the user root.
░░
░░ The leading process of the session is 7213.
Jan 23 20:24:21 managed-node1 systemd[1]: Started Session 6 of User root.
░░ Subject: A start job for unit session-6.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-6.scope has finished successfully.
░░
░░ The job identifier is 971.
Jan 23 20:24:21 managed-node1 sshd-session[7213]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Jan 23 20:24:21 managed-node1 sshd-session[7216]: Received disconnect from 10.31.43.8 port 42968:11: disconnected by user
Jan 23 20:24:21 managed-node1 sshd-session[7216]: Disconnected from user root 10.31.43.8 port 42968
Jan 23 20:24:21 managed-node1 sshd-session[7213]: pam_unix(sshd:session): session closed for user root
Jan 23 20:24:21 managed-node1 systemd[1]: session-6.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-6.scope has successfully entered the 'dead' state.
Jan 23 20:24:21 managed-node1 systemd-logind[610]: Session 6 logged out. Waiting for processes to exit.
Jan 23 20:24:21 managed-node1 systemd-logind[610]: Removed session 6.
░░ Subject: Session 6 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 6 has been terminated.
Jan 23 20:24:22 managed-node1 sshd-session[7242]: Accepted publickey for root from 10.31.43.8 port 42978 ssh2: ECDSA SHA256:r49ag/ZI5l6VEBxDDOhJoK2/r9WU4JvUQcGBr65Zj1c
Jan 23 20:24:22 managed-node1 systemd-logind[610]: New session 7 of user root.
░░ Subject: A new session 7 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 7 has been created for the user root.
░░
░░ The leading process of the session is 7242.
Jan 23 20:24:22 managed-node1 systemd[1]: Started Session 7 of User root.
░░ Subject: A start job for unit session-7.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-7.scope has finished successfully.
░░
░░ The job identifier is 1040.
Jan 23 20:24:22 managed-node1 sshd-session[7242]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Jan 23 20:24:23 managed-node1 python3.9[7419]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Jan 23 20:24:24 managed-node1 python3.9[7570]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Jan 23 20:24:24 managed-node1 python3.9[7719]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Jan 23 20:24:25 managed-node1 python3.9[7796]: ansible-ansible.legacy.dnf Invoked with name=[] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Jan 23 20:24:37 managed-node1 python3.9[7980]: ansible-ansible.legacy.stat Invoked with path=/etc/foo.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Jan 23 20:24:37 managed-node1 python3.9[8100]: ansible-ansible.legacy.copy Invoked with src=/root/.ansible/tmp/ansible-tmp-1769217877.018957-7775-270190277607266/.source.conf dest=/etc/foo.conf backup=True mode=0400 follow=False _original_basename=foo.conf.j2 checksum=52a44b94ed4361ce3726994d2dedbe99528ed1df force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Jan 23 20:24:38 managed-node1 python3.9[8249]: ansible-slurp Invoked with path=/etc/cvm_deploy.conf src=/etc/cvm_deploy.conf
Jan 23 20:24:38 managed-node1 sshd-session[8274]: Accepted publickey for root from 10.31.43.8 port 59968 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 23 20:24:38 managed-node1 systemd-logind[610]: New session 8 of user root.
░░ Subject: A new session 8 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 8 has been created for the user root.
░░
░░ The leading process of the session is 8274.
Jan 23 20:24:38 managed-node1 systemd[1]: Started Session 8 of User root.
░░ Subject: A start job for unit session-8.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-8.scope has finished successfully.
░░
░░ The job identifier is 1110.
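The module invocations above also explain the 12.58s spent in "Ensure required packages are installed": dnf was invoked with an empty name list (name=[]), so the time went to loading repository metadata rather than installing anything. A hedged sketch of a guard that would skip the call when the per-platform package list is empty (the task shape is assumed, not taken from the role source):

  - name: Ensure required packages are installed
    package:
      name: "{{ __cvm_deploy_packages }}"
      state: present
    when: __cvm_deploy_packages | length > 0   # CentOS_9.yml sets this to []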
Jan 23 20:24:38 managed-node1 sshd-session[8274]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Jan 23 20:24:38 managed-node1 sshd-session[8277]: Received disconnect from 10.31.43.8 port 59968:11: disconnected by user
Jan 23 20:24:38 managed-node1 sshd-session[8277]: Disconnected from user root 10.31.43.8 port 59968
Jan 23 20:24:38 managed-node1 sshd-session[8274]: pam_unix(sshd:session): session closed for user root
Jan 23 20:24:38 managed-node1 systemd[1]: session-8.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-8.scope has successfully entered the 'dead' state.
Jan 23 20:24:38 managed-node1 systemd-logind[610]: Session 8 logged out. Waiting for processes to exit.
Jan 23 20:24:38 managed-node1 systemd-logind[610]: Removed session 8.
░░ Subject: Session 8 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 8 has been terminated.
Jan 23 20:24:38 managed-node1 sshd-session[8302]: Accepted publickey for root from 10.31.43.8 port 59970 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Jan 23 20:24:38 managed-node1 systemd-logind[610]: New session 9 of user root.
░░ Subject: A new session 9 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 9 has been created for the user root.
░░
░░ The leading process of the session is 8302.
Jan 23 20:24:38 managed-node1 systemd[1]: Started Session 9 of User root.
░░ Subject: A start job for unit session-9.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-9.scope has finished successfully.
░░
░░ The job identifier is 1179.
Jan 23 20:24:38 managed-node1 sshd-session[8302]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)