ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-M9c
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Aug 14 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_global_config.yml **********************************************
1 plays in /tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tests_global_config.yml

PLAY [Test we can write global config with default configuration] **************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tests_global_config.yml:2
Saturday 11 October 2025 17:14:59 -0400 (0:00:00.022) 0:00:00.022 ******
[WARNING]: Platform linux on host managed-node3 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node3]

TASK [Backup configuration files] **********************************************
task path: /tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tests_global_config.yml:11
Saturday 11 October 2025 17:15:00 -0400 (0:00:01.040) 0:00:01.063 ******
included: /tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/backup.yml for managed-node3

TASK [Setup] *******************************************************************
task path: /tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/backup.yml:2
Saturday 11 October 2025 17:15:00 -0400 (0:00:00.034) 0:00:01.097 ******
included: /tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/setup.yml for managed-node3

TASK [Ensure facts used by test] ***********************************************
task path: /tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/setup.yml:2
Saturday 11 October 2025 17:15:00 -0400 (0:00:00.019) 0:00:01.117 ******
skipping: [managed-node3] => {
    "changed": false,
    "false_condition": "'os_family' not in ansible_facts",
    "skip_reason": "Conditional result was False"
}

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/setup.yml:10
Saturday 11 October 2025 17:15:00 -0400 (0:00:00.013) 0:00:01.131 ******
ok: [managed-node3] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}

TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/setup.yml:15
Saturday 11 October 2025 17:15:01 -0400 (0:00:00.430) 0:00:01.562 ******
ok: [managed-node3] => {
    "ansible_facts": {
        "__ssh_is_ostree": false
    },
    "changed": false
}

TASK [Make sure openssh is installed before creating backup] *******************
task path: /tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/setup.yml:19
Saturday 11 October 2025 17:15:01 -0400 (0:00:00.021) 0:00:01.584 ******
fatal: [managed-node3]: FAILED! => {
    "changed": false,
    "rc": 1,
    "results": []
}

MSG:

Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried

PLAY RECAP *********************************************************************
managed-node3              : ok=5    changed=0    unreachable=0    failed=1    skipped=1    rescued=0    ignored=0

SYSTEM ROLES ERRORS BEGIN v1
[
    {
        "ansible_version": "2.17.14",
        "end_time": "2025-10-11T21:15:05.715465+00:00Z",
        "host": "managed-node3",
        "message": "Failed to download metadata for repo 'highavailability': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried",
        "rc": 1,
        "start_time": "2025-10-11T21:15:01.403477+00:00Z",
        "task_name": "Make sure openssh is installed before creating backup",
        "task_path": "/tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/setup.yml:19"
    }
]
SYSTEM ROLES ERRORS END v1

TASKS RECAP ********************************************************************
Saturday 11 October 2025 17:15:05 -0400 (0:00:04.313) 0:00:05.897 ******
===============================================================================
Make sure openssh is installed before creating backup ------------------- 4.31s
/tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/setup.yml:19
Gathering Facts --------------------------------------------------------- 1.04s
/tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tests_global_config.yml:2
Check if system is ostree ----------------------------------------------- 0.43s
/tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/setup.yml:10
Backup configuration files ---------------------------------------------- 0.03s
/tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tests_global_config.yml:11
Set flag to indicate system is ostree ----------------------------------- 0.02s
/tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/setup.yml:15
Setup ------------------------------------------------------------------- 0.02s
/tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/backup.yml:2
Ensure facts used by test ----------------------------------------------- 0.01s
/tmp/collections-M9c/ansible_collections/fedora/linux_system_roles/tests/ssh/tasks/setup.yml:2

Oct 11 17:14:59 managed-node3 sshd-session[9024]: Received disconnect from 10.31.10.97 port 55874:11: disconnected by user
Oct 11 17:14:59 managed-node3 sshd-session[9024]: Disconnected from user root 10.31.10.97 port 55874
Oct 11 17:14:59 managed-node3 sshd-session[9021]: pam_unix(sshd:session): session closed for user root
Oct 11 17:14:59 managed-node3 systemd[1]: session-12.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-12.scope has successfully entered the 'dead' state.
Oct 11 17:14:59 managed-node3 systemd-logind[592]: Session 12 logged out. Waiting for processes to exit.
Oct 11 17:14:59 managed-node3 systemd-logind[592]: Removed session 12.
░░ Subject: Session 12 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 12 has been terminated.
Oct 11 17:15:00 managed-node3 python3.9[9222]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct 11 17:15:01 managed-node3 python3.9[9397]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct 11 17:15:01 managed-node3 python3.9[9546]: ansible-ansible.legacy.dnf Invoked with name=['openssh-clients'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct 11 17:15:05 managed-node3 sshd-session[9604]: Accepted publickey for root from 10.31.10.97 port 51208 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 17:15:05 managed-node3 systemd-logind[592]: New session 13 of user root.
░░ Subject: A new session 13 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 13 has been created for the user root.
░░
░░ The leading process of the session is 9604.
Oct 11 17:15:05 managed-node3 systemd[1]: Started Session 13 of User root.
░░ Subject: A start job for unit session-13.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-13.scope has finished successfully.
░░
░░ The job identifier is 1457.
Oct 11 17:15:05 managed-node3 sshd-session[9604]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Oct 11 17:15:05 managed-node3 sshd-session[9607]: Received disconnect from 10.31.10.97 port 51208:11: disconnected by user
Oct 11 17:15:05 managed-node3 sshd-session[9607]: Disconnected from user root 10.31.10.97 port 51208
Oct 11 17:15:05 managed-node3 sshd-session[9604]: pam_unix(sshd:session): session closed for user root
Oct 11 17:15:05 managed-node3 systemd[1]: session-13.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-13.scope has successfully entered the 'dead' state.
Oct 11 17:15:05 managed-node3 systemd-logind[592]: Session 13 logged out. Waiting for processes to exit.
Oct 11 17:15:05 managed-node3 systemd-logind[592]: Removed session 13.
░░ Subject: Session 13 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 13 has been terminated.
Oct 11 17:15:06 managed-node3 sshd-session[9632]: Accepted publickey for root from 10.31.10.97 port 51214 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Oct 11 17:15:06 managed-node3 systemd-logind[592]: New session 14 of user root.
░░ Subject: A new session 14 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 14 has been created for the user root.
░░
░░ The leading process of the session is 9632.
Oct 11 17:15:06 managed-node3 systemd[1]: Started Session 14 of User root.
░░ Subject: A start job for unit session-14.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-14.scope has finished successfully.
░░
░░ The job identifier is 1526.
Oct 11 17:15:06 managed-node3 sshd-session[9632]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)