ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-x20
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Aug 14 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_wrong_provider.yml *********************************************
1 plays in /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/tests/certificate/tests_wrong_provider.yml

PLAY [Test issuing certificate with nonexistent provider] **********************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/tests/certificate/tests_wrong_provider.yml:2
Saturday 11 October 2025  08:29:18 -0400 (0:00:00.019)       0:00:00.019 ******
[WARNING]: Platform linux on host managed-node2 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
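The interpreter-discovery warning above can be avoided by pinning the interpreter for the host instead of relying on discovery. A minimal inventory sketch, not taken from this run's configuration; the host name and the Python 3.9 path are simply the ones reported in the warning:

    # inventory.yml (sketch only, not part of this run)
    all:
      hosts:
        managed-node2:
          # pin the interpreter so a later Python install cannot change its meaning
          ansible_python_interpreter: /usr/bin/python3.9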
ok: [managed-node2]

TASK [fedora.linux_system_roles.certificate : Set version specific variables] ***
task path: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:2
Saturday 11 October 2025  08:29:19 -0400 (0:00:01.075)       0:00:01.095 ******
included: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.certificate : Ensure ansible_facts used by role] ***
task path: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:2
Saturday 11 October 2025  08:29:19 -0400 (0:00:00.022)       0:00:01.118 ******
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__certificate_required_facts | difference(ansible_facts.keys() | list) | length > 0",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.certificate : Check if system is ostree] *******
task path: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:10
Saturday 11 October 2025  08:29:19 -0400 (0:00:00.035)       0:00:01.153 ******
ok: [managed-node2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [fedora.linux_system_roles.certificate : Set flag to indicate system is ostree] ***
task path: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:15
Saturday 11 October 2025  08:29:19 -0400 (0:00:00.431)       0:00:01.584 ******
ok: [managed-node2] => {
    "ansible_facts": {
        "__certificate_is_ostree": false
    },
    "changed": false
}

TASK [fedora.linux_system_roles.certificate : Run systemctl] *******************
task path: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:22
Saturday 11 October 2025  08:29:19 -0400 (0:00:00.023)       0:00:01.608 ******
ok: [managed-node2] => {
    "changed": false,
    "cmd": [
        "systemctl",
        "is-system-running"
    ],
    "delta": "0:00:00.008545",
    "end": "2025-10-11 08:29:20.363673",
    "failed_when_result": false,
    "rc": 0,
    "start": "2025-10-11 08:29:20.355128"
}

STDOUT:

running

TASK [fedora.linux_system_roles.certificate : Require installed systemd] *******
task path: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:30
Saturday 11 October 2025  08:29:20 -0400 (0:00:00.480)       0:00:02.088 ******
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "\"No such file or directory\" in __is_system_running.msg | d(\"\")",
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.certificate : Set flag to indicate that systemd runtime operations are available] ***
task path: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:35
Saturday 11 October 2025  08:29:20 -0400 (0:00:00.039)       0:00:02.127 ******
ok: [managed-node2] => {
    "ansible_facts": {
        "__certificate_is_booted": true
    },
    "changed": false
}

TASK [fedora.linux_system_roles.certificate : Set platform/version specific variables] ***
task path: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:40
Saturday 11 October 2025  08:29:20 -0400 (0:00:00.023)       0:00:02.151 ******
skipping: [managed-node2] => (item=RedHat.yml) => {
    "ansible_loop_var": "item",
    "changed": false,
    "false_condition": "__vars_file is file",
    "item": "RedHat.yml",
    "skip_reason": "Conditional result was False"
}
skipping: [managed-node2] => (item=CentOS.yml) => {
"ansible_loop_var": "item", "changed": false, "false_condition": "__vars_file is file", "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "__certificate_certmonger_packages": [ "certmonger", "python3-packaging" ] }, "ansible_included_var_files": [ "/tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } ok: [managed-node2] => (item=CentOS_9.yml) => { "ansible_facts": { "__certificate_certmonger_packages": [ "certmonger", "python3-packaging" ] }, "ansible_included_var_files": [ "/tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/vars/CentOS_9.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_9.yml" } TASK [fedora.linux_system_roles.certificate : Ensure certificate role dependencies are installed] *** task path: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:5 Saturday 11 October 2025 08:29:20 -0400 (0:00:00.041) 0:00:02.192 ****** fatal: [managed-node2]: FAILED! => { "changed": false, "rc": 1, "results": [] } MSG: Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried TASK [Assert...] *************************************************************** task path: /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/tests/certificate/tests_wrong_provider.yml:24 Saturday 11 October 2025 08:29:24 -0400 (0:00:04.315) 0:00:06.508 ****** fatal: [managed-node2]: FAILED! => {} MSG: The task includes an option with an undefined variable.. list object has no element 0 The error appears to be in '/tmp/collections-x20/ansible_collections/fedora/linux_system_roles/tests/certificate/tests_wrong_provider.yml': line 24, column 11, but may be elsewhere in the file depending on the exact syntax problem. The offending line appears to be: rescue: - name: Assert... ^ here PLAY RECAP ********************************************************************* managed-node2 : ok=7 changed=0 unreachable=0 failed=1 skipped=2 rescued=1 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.17.14", "end_time": "2025-10-11T12:29:24.830201+00:00Z", "host": "managed-node2", "message": "Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried", "rc": 1, "start_time": "2025-10-11T12:29:20.519661+00:00Z", "task_name": "Ensure certificate role dependencies are installed", "task_path": "/tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:5" }, { "ansible_version": "2.17.14", "end_time": "2025-10-11T12:29:24.846037+00:00Z", "host": "managed-node2", "message": "The task includes an option with an undefined variable.. 
list object has no element 0\n\nThe error appears to be in '/tmp/collections-x20/ansible_collections/fedora/linux_system_roles/tests/certificate/tests_wrong_provider.yml': line 24, column 11, but may\nbe elsewhere in the file depending on the exact syntax problem.\n\nThe offending line appears to be:\n\n rescue:\n - name: Assert...\n ^ here\n", "start_time": "2025-10-11T12:29:24.835148+00:00Z", "task_name": "Assert...", "task_path": "/tmp/collections-x20/ansible_collections/fedora/linux_system_roles/tests/certificate/tests_wrong_provider.yml:24" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Saturday 11 October 2025 08:29:24 -0400 (0:00:00.012) 0:00:06.520 ****** =============================================================================== fedora.linux_system_roles.certificate : Ensure certificate role dependencies are installed --- 4.32s /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:5 Gathering Facts --------------------------------------------------------- 1.08s /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/tests/certificate/tests_wrong_provider.yml:2 fedora.linux_system_roles.certificate : Run systemctl ------------------- 0.48s /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:22 fedora.linux_system_roles.certificate : Check if system is ostree ------- 0.43s /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:10 fedora.linux_system_roles.certificate : Set platform/version specific variables --- 0.04s /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:40 fedora.linux_system_roles.certificate : Require installed systemd ------- 0.04s /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:30 fedora.linux_system_roles.certificate : Ensure ansible_facts used by role --- 0.04s /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:2 fedora.linux_system_roles.certificate : Set flag to indicate that systemd runtime operations are available --- 0.02s /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:35 fedora.linux_system_roles.certificate : Set flag to indicate system is ostree --- 0.02s /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/set_vars.yml:15 fedora.linux_system_roles.certificate : Set version specific variables --- 0.02s /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/roles/certificate/tasks/main.yml:2 Assert... --------------------------------------------------------------- 0.01s /tmp/collections-x20/ansible_collections/fedora/linux_system_roles/tests/certificate/tests_wrong_provider.yml:24 Oct 11 08:29:17 managed-node2 sudo[12230]: pam_unix(sudo:session): session closed for user root Oct 11 08:29:17 managed-node2 sshd-session[12290]: Accepted publickey for root from 10.31.40.56 port 49580 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE Oct 11 08:29:17 managed-node2 systemd-logind[636]: New session 18 of user root. ░░ Subject: A new session 18 has been created for user root ░░ Defined-By: systemd ░░ Support: https://access.redhat.com/support ░░ Documentation: sd-login(3) ░░ ░░ A new session with the ID 18 has been created for the user root. 
░░
░░ The leading process of the session is 12290.
Oct 11 08:29:17 managed-node2 systemd[1]: Started Session 18 of User root.
░░ Subject: A start job for unit session-18.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-18.scope has finished successfully.
░░
░░ The job identifier is 1798.
Oct 11 08:29:17 managed-node2 sshd-session[12290]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 08:29:17 managed-node2 sshd-session[12293]: Received disconnect from 10.31.40.56 port 49580:11: disconnected by user
Oct 11 08:29:17 managed-node2 sshd-session[12293]: Disconnected from user root 10.31.40.56 port 49580
Oct 11 08:29:17 managed-node2 sshd-session[12290]: pam_unix(sshd:session): session closed for user root
Oct 11 08:29:17 managed-node2 systemd[1]: session-18.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-18.scope has successfully entered the 'dead' state.
Oct 11 08:29:17 managed-node2 systemd-logind[636]: Session 18 logged out. Waiting for processes to exit.
Oct 11 08:29:17 managed-node2 systemd-logind[636]: Removed session 18.
░░ Subject: Session 18 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 18 has been terminated.
Oct 11 08:29:17 managed-node2 sshd-session[12318]: Accepted publickey for root from 10.31.40.56 port 49590 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 08:29:17 managed-node2 systemd-logind[636]: New session 19 of user root.
░░ Subject: A new session 19 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 19 has been created for the user root.
░░
░░ The leading process of the session is 12318.
Oct 11 08:29:17 managed-node2 systemd[1]: Started Session 19 of User root.
░░ Subject: A start job for unit session-19.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-19.scope has finished successfully.
░░
░░ The job identifier is 1867.
Oct 11 08:29:17 managed-node2 sshd-session[12318]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 08:29:17 managed-node2 sshd-session[12321]: Received disconnect from 10.31.40.56 port 49590:11: disconnected by user
Oct 11 08:29:17 managed-node2 sshd-session[12321]: Disconnected from user root 10.31.40.56 port 49590
Oct 11 08:29:17 managed-node2 sshd-session[12318]: pam_unix(sshd:session): session closed for user root
Oct 11 08:29:17 managed-node2 systemd[1]: session-19.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-19.scope has successfully entered the 'dead' state.
Oct 11 08:29:17 managed-node2 systemd-logind[636]: Session 19 logged out. Waiting for processes to exit.
Oct 11 08:29:17 managed-node2 systemd-logind[636]: Removed session 19.
░░ Subject: Session 19 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 19 has been terminated.
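The journal entries that follow show the module invocations behind the role tasks above; the ansible.legacy.dnf call is the one that failed because metadata for the 'highavailability' repository could not be downloaded, so the failure is repository-side rather than a role bug. A minimal pre-task sketch for re-running with that repo disabled, assuming dnf-plugins-core (dnf config-manager) is present on the managed node; the repo id is taken from the error above, everything else is an assumption:

    # Sketch only; not part of this run.
    - name: Disable the repo whose mirrors are unreachable
      ansible.builtin.command: dnf config-manager --set-disabled highavailability
      changed_when: true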
Oct 11 08:29:19 managed-node2 python3.9[12519]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 11 08:29:19 managed-node2 python3.9[12696]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 11 08:29:20 managed-node2 python3.9[12845]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct 11 08:29:21 managed-node2 python3.9[12995]: ansible-ansible.legacy.dnf Invoked with name=['python3-pyasn1', 'python3-cryptography', 'python3-dbus'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 11 08:29:25 managed-node2 sshd-session[13053]: Accepted publickey for root from 10.31.40.56 port 59638 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 08:29:25 managed-node2 systemd-logind[636]: New session 20 of user root.
░░ Subject: A new session 20 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 20 has been created for the user root.
░░
░░ The leading process of the session is 13053.
Oct 11 08:29:25 managed-node2 systemd[1]: Started Session 20 of User root.
░░ Subject: A start job for unit session-20.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-20.scope has finished successfully.
░░
░░ The job identifier is 1936.
Oct 11 08:29:25 managed-node2 sshd-session[13053]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 08:29:25 managed-node2 sshd-session[13056]: Received disconnect from 10.31.40.56 port 59638:11: disconnected by user
Oct 11 08:29:25 managed-node2 sshd-session[13056]: Disconnected from user root 10.31.40.56 port 59638
Oct 11 08:29:25 managed-node2 sshd-session[13053]: pam_unix(sshd:session): session closed for user root
Oct 11 08:29:25 managed-node2 systemd[1]: session-20.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-20.scope has successfully entered the 'dead' state.
Oct 11 08:29:25 managed-node2 systemd-logind[636]: Session 20 logged out. Waiting for processes to exit.
Oct 11 08:29:25 managed-node2 systemd-logind[636]: Removed session 20.
░░ Subject: Session 20 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 20 has been terminated.
Oct 11 08:29:25 managed-node2 sshd-session[13081]: Accepted publickey for root from 10.31.40.56 port 59644 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 08:29:25 managed-node2 systemd-logind[636]: New session 21 of user root.
░░ Subject: A new session 21 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 21 has been created for the user root.
░░
░░ The leading process of the session is 13081.
Oct 11 08:29:25 managed-node2 systemd[1]: Started Session 21 of User root.
░░ Subject: A start job for unit session-21.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-21.scope has finished successfully.
░░
░░ The job identifier is 2005.
Oct 11 08:29:25 managed-node2 sshd-session[13081]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
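The second error in the run ('list object has no element 0' at tests_wrong_provider.yml:24) comes from the test's rescue task indexing element 0 of a results list that is empty here: the dependency install failed with "results": [] before the role ever attempted the certificate request, so the rescued assert never sees the expected provider error. A defensive sketch of such a rescue assert, assuming the check is on the failure message; the real task is not shown in this log, and the expected text ('Unknown provider'), the variable name failed_results, and the surrounding structure are assumptions:

    # Hypothetical rescue fragment; tests_wrong_provider.yml line 24 is not reproduced in this log.
    rescue:
      - name: Assert...
        ansible.builtin.assert:
          that:
            # guard the index so an empty results list fails the assert
            # instead of raising an undefined-variable error
            - failed_results | length > 0
            - "'Unknown provider' in failed_results[0].msg | d('')"
        vars:
          failed_results: "{{ ansible_failed_result.results | d([]) }}"

ansible_failed_result is the standard variable available inside a rescue section; the d() default filter keeps the expressions defined even when the failing task produced no per-item results, as in this run.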