ansible-playbook [core 2.17.14]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-O8Y
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.11 (main, Aug 14 2025, 00:00:00) [GCC 11.5.0 20240719 (Red Hat 11.5.0-11)] (/usr/bin/python3.12)
  jinja version = 3.1.6
  libyaml = True
No config file found; using defaults
running playbook inside collection fedora.linux_system_roles
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'jsonl', as we already have a stdout callback.
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_skip_toolkit.yml ***********************************************
1 plays in /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/tests/hpc/tests_skip_toolkit.yml

PLAY [Verify if role configures a custom storage properly] *********************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/tests/hpc/tests_skip_toolkit.yml:3
Wednesday 17 September 2025  11:06:33 -0400 (0:00:00.019)       0:00:00.019 ***
[WARNING]: Platform linux on host managed-node1 is using the discovered Python
interpreter at /usr/bin/python3.9, but future installation of another Python
interpreter could change the meaning of that path. See
https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html
for more information.
ok: [managed-node1]

TASK [Ensure test packages] ****************************************************
task path: /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/tests/hpc/tests_skip_toolkit.yml:22
Wednesday 17 September 2025  11:06:34 -0400 (0:00:01.090)       0:00:01.109 ***
fatal: [managed-node1]: FAILED! => {
    "changed": false,
    "rc": 1,
    "results": []
}

MSG:

Failed to download metadata for repo 'rhel-9-for-x86_64-appstream-rhui-rpms': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried
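A minimal sketch for reproducing this metadata failure outside Ansible, to be run on the managed node. The repo id is copied from the message above; the --repo option is assumed to be available in this dnf build, so treat this as a hypothetical probe rather than a verified command:

# Hypothetical repro helper: ask dnf to rebuild metadata for just the
# failing repo and surface its error output. Run on managed-node1.
import subprocess

REPO_ID = "rhel-9-for-x86_64-appstream-rhui-rpms"  # taken from the error above

proc = subprocess.run(
    ["dnf", "-v", "makecache", "--repo", REPO_ID],
    capture_output=True,
    text=True,
)
print("rc:", proc.returncode)        # the task above reported rc=1
print(proc.stderr or proc.stdout)    # mirror/URL details appear here with -v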
"exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 17 September 2025 11:06:36 -0400 (0:00:00.429) 0:00:02.606 *** ok: [managed-node1] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 17 September 2025 11:06:36 -0400 (0:00:00.022) 0:00:02.629 *** ok: [managed-node1] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 17 September 2025 11:06:36 -0400 (0:00:00.015) 0:00:02.644 *** ok: [managed-node1] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 17 September 2025 11:06:36 -0400 (0:00:00.015) 0:00:02.659 *** redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount redirecting (type: modules) ansible.builtin.mount to ansible.posix.mount included: /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node1 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 17 September 2025 11:06:36 -0400 (0:00:00.043) 0:00:02.702 *** fatal: [managed-node1]: FAILED! 
=> { "changed": false, "rc": 1, "results": [] } MSG: Failed to download metadata for repo 'rhel-9-for-x86_64-appstream-rhui-rpms': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried PLAY RECAP ********************************************************************* managed-node1 : ok=9 changed=0 unreachable=0 failed=2 skipped=1 rescued=0 ignored=0 SYSTEM ROLES ERRORS BEGIN v1 [ { "ansible_version": "2.17.14", "end_time": "2025-09-17T15:06:35.812063+00:00Z", "host": "managed-node1", "message": "Failed to download metadata for repo 'rhel-9-for-x86_64-appstream-rhui-rpms': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried", "rc": 1, "start_time": "2025-09-17T15:06:34.870775+00:00Z", "task_name": "Ensure test packages", "task_path": "/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/tests/hpc/tests_skip_toolkit.yml:22" }, { "ansible_version": "2.17.14", "end_time": "2025-09-17T15:06:37.247934+00:00Z", "host": "managed-node1", "message": "Failed to download metadata for repo 'rhel-9-for-x86_64-appstream-rhui-rpms': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried", "rc": 1, "start_time": "2025-09-17T15:06:36.463743+00:00Z", "task_name": "Make sure blivet is available", "task_path": "/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2" } ] SYSTEM ROLES ERRORS END v1 TASKS RECAP ******************************************************************** Wednesday 17 September 2025 11:06:37 -0400 (0:00:00.786) 0:00:03.488 *** =============================================================================== Gathering Facts --------------------------------------------------------- 1.09s /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/tests/hpc/tests_skip_toolkit.yml:3 Ensure test packages ---------------------------------------------------- 0.95s /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/tests/hpc/tests_skip_toolkit.yml:22 fedora.linux_system_roles.storage : Make sure blivet is available ------- 0.79s /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 fedora.linux_system_roles.storage : Check if system is ostree ----------- 0.43s /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 fedora.linux_system_roles.storage : Include the appropriate provider tasks --- 0.04s /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 fedora.linux_system_roles.storage : Set platform/version specific variables --- 0.04s /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 fedora.linux_system_roles.storage : Ensure ansible_facts used by role --- 0.03s /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Remove both of the LVM logical volumes in 'foo' created above ----------- 0.02s /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/tests/hpc/tests_skip_toolkit.yml:93 fedora.linux_system_roles.storage : Set flag to indicate system is ostree --- 0.02s /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 fedora.linux_system_roles.storage : Set platform/version specific variables --- 0.02s /tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 
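The block between the BEGIN/END markers is plain JSON, so a captured log can be triaged mechanically. A sketch; the file name run.log is hypothetical:

# Sketch: pull the machine-readable error summary out of a saved log.
# The "SYSTEM ROLES ERRORS BEGIN v1"/"END v1" markers come from the
# output above; everything between them is a JSON array.
import json
import re

def extract_errors(log_text):
    match = re.search(
        r"SYSTEM ROLES ERRORS BEGIN v1(.*?)SYSTEM ROLES ERRORS END v1",
        log_text,
        re.DOTALL,
    )
    return json.loads(match.group(1)) if match else []

# for err in extract_errors(open("run.log").read()):
#     print(err["task_name"], "->", err["message"])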
TASKS RECAP ********************************************************************
Wednesday 17 September 2025  11:06:37 -0400 (0:00:00.786)       0:00:03.488 ***
===============================================================================
Gathering Facts --------------------------------------------------------- 1.09s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/tests/hpc/tests_skip_toolkit.yml:3
Ensure test packages ---------------------------------------------------- 0.95s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/tests/hpc/tests_skip_toolkit.yml:22
fedora.linux_system_roles.storage : Make sure blivet is available ------- 0.79s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
fedora.linux_system_roles.storage : Check if system is ostree ----------- 0.43s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
fedora.linux_system_roles.storage : Include the appropriate provider tasks --- 0.04s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
fedora.linux_system_roles.storage : Set platform/version specific variables --- 0.04s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
fedora.linux_system_roles.storage : Ensure ansible_facts used by role --- 0.03s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Remove both of the LVM logical volumes in 'foo' created above ----------- 0.02s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/tests/hpc/tests_skip_toolkit.yml:93
fedora.linux_system_roles.storage : Set flag to indicate system is ostree --- 0.02s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
fedora.linux_system_roles.storage : Set platform/version specific variables --- 0.02s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing --- 0.02s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing --- 0.02s
/tmp/collections-O8Y/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
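The recap durations for the failed tasks can be cross-checked against the start_time/end_time pairs in the error summary above. One wrinkle: those timestamps carry both a "+00:00" offset and a trailing "Z", which datetime.fromisoformat() rejects on the Python versions seen in this run, so this sketch strips the "Z" first:

# Sketch: recompute a task duration from the error summary timestamps.
from datetime import datetime

def parse_ts(ts):
    # Timestamps above look like "2025-09-17T15:06:34.870775+00:00Z";
    # drop the redundant trailing "Z" so fromisoformat() accepts them.
    return datetime.fromisoformat(ts.rstrip("Z"))

start = parse_ts("2025-09-17T15:06:34.870775+00:00Z")
end = parse_ts("2025-09-17T15:06:35.812063+00:00Z")
print((end - start).total_seconds())  # ~0.94s, roughly the recap's 0.95s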
Sep 17 11:06:33 managed-node1 sshd[10671]: Accepted publickey for root from 10.31.15.125 port 46136 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Sep 17 11:06:33 managed-node1 systemd-logind[609]: New session 16 of user root.
░░ Subject: A new session 16 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 16 has been created for the user root.
░░
░░ The leading process of the session is 10671.
Sep 17 11:06:33 managed-node1 systemd[1]: Started Session 16 of User root.
░░ Subject: A start job for unit session-16.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-16.scope has finished successfully.
░░
░░ The job identifier is 1798.
Sep 17 11:06:33 managed-node1 sshd[10671]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Sep 17 11:06:33 managed-node1 sshd[10674]: Received disconnect from 10.31.15.125 port 46136:11: disconnected by user
Sep 17 11:06:33 managed-node1 sshd[10674]: Disconnected from user root 10.31.15.125 port 46136
Sep 17 11:06:33 managed-node1 sshd[10671]: pam_unix(sshd:session): session closed for user root
Sep 17 11:06:33 managed-node1 systemd-logind[609]: Session 16 logged out. Waiting for processes to exit.
Sep 17 11:06:33 managed-node1 systemd[1]: session-16.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-16.scope has successfully entered the 'dead' state.
Sep 17 11:06:33 managed-node1 systemd-logind[609]: Removed session 16.
░░ Subject: Session 16 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 16 has been terminated.
Sep 17 11:06:33 managed-node1 sshd[10699]: Accepted publickey for root from 10.31.15.125 port 46138 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Sep 17 11:06:33 managed-node1 systemd-logind[609]: New session 17 of user root.
░░ Subject: A new session 17 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 17 has been created for the user root.
░░
░░ The leading process of the session is 10699.
Sep 17 11:06:33 managed-node1 systemd[1]: Started Session 17 of User root.
░░ Subject: A start job for unit session-17.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-17.scope has finished successfully.
░░
░░ The job identifier is 1867.
Sep 17 11:06:33 managed-node1 sshd[10699]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Sep 17 11:06:33 managed-node1 sshd[10702]: Received disconnect from 10.31.15.125 port 46138:11: disconnected by user
Sep 17 11:06:33 managed-node1 sshd[10702]: Disconnected from user root 10.31.15.125 port 46138
Sep 17 11:06:33 managed-node1 sshd[10699]: pam_unix(sshd:session): session closed for user root
Sep 17 11:06:33 managed-node1 systemd[1]: session-17.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-17.scope has successfully entered the 'dead' state.
Sep 17 11:06:33 managed-node1 systemd-logind[609]: Session 17 logged out. Waiting for processes to exit.
Sep 17 11:06:33 managed-node1 systemd-logind[609]: Removed session 17.
░░ Subject: Session 17 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 17 has been terminated.
Sep 17 11:06:34 managed-node1 python3.9[10900]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Sep 17 11:06:35 managed-node1 python3.9[11077]: ansible-ansible.legacy.dnf Invoked with name=['util-linux-core'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 17 11:06:36 managed-node1 python3.9[11228]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Sep 17 11:06:36 managed-node1 python3.9[11377]: ansible-ansible.legacy.dnf Invoked with name=['python3-blivet', 'libblockdev-crypto', 'libblockdev-dm', 'libblockdev-lvm', 'libblockdev-mdraid', 'libblockdev-swap', 'vdo', 'kmod-kvdo', 'xfsprogs', 'stratisd', 'stratis-cli', 'libblockdev'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Sep 17 11:06:37 managed-node1 sshd[11404]: Accepted publickey for root from 10.31.15.125 port 46146 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Sep 17 11:06:37 managed-node1 systemd-logind[609]: New session 18 of user root.
░░ Subject: A new session 18 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 18 has been created for the user root.
░░
░░ The leading process of the session is 11404.
Sep 17 11:06:37 managed-node1 systemd[1]: Started Session 18 of User root.
░░ Subject: A start job for unit session-18.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-18.scope has finished successfully.
░░
░░ The job identifier is 1936.
Sep 17 11:06:37 managed-node1 sshd[11404]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
Sep 17 11:06:37 managed-node1 sshd[11407]: Received disconnect from 10.31.15.125 port 46146:11: disconnected by user
Sep 17 11:06:37 managed-node1 sshd[11407]: Disconnected from user root 10.31.15.125 port 46146
Sep 17 11:06:37 managed-node1 sshd[11404]: pam_unix(sshd:session): session closed for user root
Sep 17 11:06:37 managed-node1 systemd[1]: session-18.scope: Deactivated successfully.
░░ Subject: Unit succeeded
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ The unit session-18.scope has successfully entered the 'dead' state.
Sep 17 11:06:37 managed-node1 systemd-logind[609]: Session 18 logged out. Waiting for processes to exit.
Sep 17 11:06:37 managed-node1 systemd-logind[609]: Removed session 18.
░░ Subject: Session 18 has been terminated
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A session with the ID 18 has been terminated.
Sep 17 11:06:37 managed-node1 sshd[11432]: Accepted publickey for root from 10.31.15.125 port 46148 ssh2: RSA SHA256:9j1blwt3wcrRiGYZQ7ZGu9axm3cDklH6/z4c+Ee8CzE
Sep 17 11:06:37 managed-node1 systemd-logind[609]: New session 19 of user root.
░░ Subject: A new session 19 has been created for user root
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░ Documentation: sd-login(3)
░░
░░ A new session with the ID 19 has been created for the user root.
░░
░░ The leading process of the session is 11432.
Sep 17 11:06:37 managed-node1 systemd[1]: Started Session 19 of User root.
░░ Subject: A start job for unit session-19.scope has finished successfully
░░ Defined-By: systemd
░░ Support: https://access.redhat.com/support
░░
░░ A start job for unit session-19.scope has finished successfully.
░░
░░ The job identifier is 2005.
Sep 17 11:06:37 managed-node1 sshd[11432]: pam_unix(sshd:session): session opened for user root(uid=0) by root(uid=0)
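When correlating a journal capture like the one above with the playbook run, the lines of interest are the per-task module invocations. A sketch that filters them out of saved journal text; on this capture (which ends mid-session) exactly four appear, the last two being the dnf calls that failed against the broken repo:

# Sketch: list the Ansible module invocations recorded in a journal capture.
import re

PATTERN = re.compile(r"python3\.9\[\d+\]: (ansible-\S+) Invoked")

def module_calls(journal_text):
    for line in journal_text.splitlines():
        m = PATTERN.search(line)
        if m:
            yield m.group(1)

# Expected on this capture: ansible-ansible.legacy.setup,
# ansible-ansible.legacy.dnf (util-linux-core), ansible-stat,
# ansible-ansible.legacy.dnf (the blivet package list).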