ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, Nov 14 2023, 16:14:06) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.
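For orientation before the play output starts: the shape of the play this log records can be reconstructed from the task names, file paths, and echoed variable values that appear below. A minimal sketch, assuming include_role and a FIPS gating variable (both assumptions; the collection's actual tests_luks.yml is the authoritative source):

# Hypothetical reconstruction of the start of tests_luks.yml -- for
# orientation only; assembled from task names in this log, not copied
# from the collection.
- name: Test LUKS
  hosts: all
  tasks:
    # "Enable FIPS mode", "Reboot", "Ensure dracut-fips", and "Configure
    # boot for FIPS" are all skipped in this run, so whatever condition
    # gates them (assumed here to be a test variable) was false.
    - name: Ensure dracut-fips
      package:
        name: dracut-fips
        state: present
      when: fips_mode | d(false)          # assumed variable name

    # First role run with no storage_pools/storage_volumes defined (both
    # are echoed as undefined below); it only ensures blivet and its
    # dependencies are installed and refreshes facts.
    - name: Run the role
      include_role:                       # include_role vs import_role is assumed
        name: fedora.linux_system_roles.storage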
PLAYBOOK: tests_luks.yml *******************************************************
1 plays in /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml

PLAY [Test LUKS] ***************************************************************

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2
Wednesday 11 December 2024 15:15:22 -0500 (0:00:00.043) 0:00:00.043 ****
ok: [managed-node3]
META: ran handlers

TASK [Enable FIPS mode] ********************************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20
Wednesday 11 December 2024 15:15:23 -0500 (0:00:01.310) 0:00:01.353 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:24
Wednesday 11 December 2024 15:15:23 -0500 (0:00:00.078) 0:00:01.432 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure dracut-fips] ******************************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:34
Wednesday 11 December 2024 15:15:24 -0500 (0:00:00.100) 0:00:01.533 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Configure boot for FIPS] *************************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:40
Wednesday 11 December 2024 15:15:24 -0500 (0:00:00.076) 0:00:01.610 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Reboot] ******************************************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:49
Wednesday 11 December 2024 15:15:24 -0500 (0:00:00.088) 0:00:01.698 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Run the role] ************************************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53
Wednesday 11 December 2024 15:15:24 -0500 (0:00:00.094) 0:00:01.793 ****

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 11 December 2024 15:15:24 -0500 (0:00:00.101) 0:00:01.894 ****
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 11 December 2024 15:15:24 -0500 (0:00:00.093) 0:00:01.988 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 11 December 2024 15:15:24 -0500 (0:00:00.102) 0:00:02.090 ****
skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Wednesday 11 December 2024 15:15:24 -0500 (0:00:00.154) 0:00:02.245 ****
ok: [managed-node3] => { "changed": false, "stat": { "exists": false } }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Wednesday 11 December 2024 15:15:25 -0500 (0:00:00.997) 0:00:03.243 ****
ok: [managed-node3] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 11 December 2024 15:15:25 -0500 (0:00:00.208) 0:00:03.452 ****
ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 11 December 2024 15:15:26 -0500 (0:00:00.078) 0:00:03.530 ****
ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 11 December 2024 15:15:26 -0500 (0:00:00.039) 0:00:03.570 ****
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 11 December 2024 15:15:26 -0500 (0:00:00.187) 0:00:03.757 ****
ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Wednesday 11 December 2024 15:15:27 -0500 (0:00:01.572) 0:00:05.330 ****
ok: [managed-node3] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Wednesday 11 December 2024 15:15:27 -0500 (0:00:00.105) 0:00:05.436 ****
ok: [managed-node3] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 11 December 2024 15:15:27 -0500 (0:00:00.076) 0:00:05.513 ****
ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Wednesday 11 December 2024 15:15:28 -0500 (0:00:00.929) 0:00:06.443 ****
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 11 December 2024 15:15:29 -0500 (0:00:00.147) 0:00:06.590 ****

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 11 December 2024 15:15:29 -0500 (0:00:00.035) 0:00:06.626 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Wednesday 11 December 2024 15:15:29 -0500 (0:00:00.039) 0:00:06.665 ****

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Wednesday 11 December 2024 15:15:29 -0500 (0:00:00.038) 0:00:06.703 ****
ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
lsrpackages: kpartx

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Wednesday 11 December 2024 15:15:29 -0500 (0:00:00.727) 0:00:07.431 ****
ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name":
"dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:15:31 -0500 (0:00:01.342) 0:00:08.774 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:15:31 -0500 (0:00:00.079) 
0:00:08.853 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:15:31 -0500 (0:00:00.049) 0:00:08.903 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:15:31 -0500 (0:00:00.546) 0:00:09.449 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:15:31 -0500 (0:00:00.057) 0:00:09.506 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948105.7832756, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1733948105.069275, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1733948105.069275, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:15:32 -0500 (0:00:00.416) 0:00:09.923 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:15:32 -0500 (0:00:00.041) 0:00:09.964 **** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:15:32 -0500 (0:00:00.031) 0:00:09.996 **** ok: [managed-node3] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:15:32 -0500 (0:00:00.038) 0:00:10.035 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test 
verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:15:32 -0500 (0:00:00.039) 0:00:10.074 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:15:32 -0500 (0:00:00.040) 0:00:10.115 **** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 December 2024 15:15:32 -0500 (0:00:00.034) 0:00:10.149 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:15:32 -0500 (0:00:00.035) 0:00:10.184 **** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 December 2024 15:15:32 -0500 (0:00:00.033) 0:00:10.217 **** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:15:32 -0500 (0:00:00.032) 0:00:10.250 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:15:32 -0500 (0:00:00.034) 0:00:10.285 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733946736.7818234, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1733946734.6568215, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263648, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1733946734.6568215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071595669610", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:15:33 -0500 (0:00:00.330) 0:00:10.615 **** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 
Wednesday 11 December 2024 15:15:33 -0500 (0:00:00.031) 0:00:10.647 **** ok: [managed-node3] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:57 Wednesday 11 December 2024 15:15:33 -0500 (0:00:00.697) 0:00:11.345 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node3 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Wednesday 11 December 2024 15:15:33 -0500 (0:00:00.105) 0:00:11.451 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: util-linux TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Wednesday 11 December 2024 15:15:34 -0500 (0:00:00.637) 0:00:12.089 **** ok: [managed-node3] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'fstype': '', 'type': 'disk', 'ssize': '512', 'size': '268435456000'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.465) 0:00:12.555 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.034) 0:00:12.589 **** ok: [managed-node3] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.041) 0:00:12.631 **** skipping: 
[managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.035) 0:00:12.667 **** ok: [managed-node3] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:66 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.040) 0:00:12.708 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.079) 0:00:12.788 **** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.042) 0:00:12.830 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.062) 0:00:12.892 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.056) 0:00:12.949 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.041) 0:00:12.990 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { 
"ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.085) 0:00:13.075 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.037) 0:00:13.113 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.034) 0:00:13.147 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.035) 0:00:13.183 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.035) 0:00:13.219 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:15:35 -0500 (0:00:00.087) 0:00:13.306 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:15:37 -0500 (0:00:01.219) 0:00:14.526 **** ok: [managed-node3] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** 
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Wednesday 11 December 2024 15:15:37 -0500 (0:00:00.063) 0:00:14.589 ****
ok: [managed-node3] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 11 December 2024 15:15:37 -0500 (0:00:00.069) 0:00:14.658 ****
ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Wednesday 11 December 2024 15:15:41 -0500 (0:00:04.072) 0:00:18.731 ****
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 11 December 2024 15:15:41 -0500 (0:00:00.091) 0:00:18.822 ****

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 11 December 2024 15:15:41 -0500 (0:00:00.045) 0:00:18.868 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Wednesday 11 December 2024 15:15:41 -0500 (0:00:00.052) 0:00:18.920 ****

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Wednesday 11 December 2024 15:15:41 -0500 (0:00:00.064) 0:00:18.984 ****
ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
lsrpackages: cryptsetup kpartx

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Wednesday 11 December 2024 15:15:42 -0500 (0:00:00.810) 0:00:19.795 ****
ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name":
"auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", 
"status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", 
"state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:15:43 -0500 (0:00:01.356) 0:00:21.151 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:15:43 -0500 (0:00:00.091) 0:00:21.243 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:15:43 -0500 (0:00:00.040) 0:00:21.283 **** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Wednesday 11 December 2024 15:15:47 -0500 (0:00:04.230) 0:00:25.513 **** fatal: [managed-node3]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'foo' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:15:48 -0500 (0:00:00.117) 0:00:25.631 **** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 11 December 2024 15:15:48 -0500 (0:00:00.096) 0:00:25.728 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 
Wednesday 11 December 2024 15:15:48 -0500 (0:00:00.103) 0:00:25.832 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 11 December 2024 15:15:48 -0500 (0:00:00.069) 0:00:25.901 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:81 Wednesday 11 December 2024 15:15:48 -0500 (0:00:00.055) 0:00:25.957 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:15:48 -0500 (0:00:00.130) 0:00:26.087 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:15:48 -0500 (0:00:00.083) 0:00:26.170 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:15:48 -0500 (0:00:00.059) 0:00:26.230 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:15:48 -0500 (0:00:00.121) 0:00:26.351 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:15:48 -0500 
(0:00:00.049) 0:00:26.401 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:15:48 -0500 (0:00:00.047) 0:00:26.448 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:15:48 -0500 (0:00:00.057) 0:00:26.505 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:15:49 -0500 (0:00:00.067) 0:00:26.573 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:15:49 -0500 (0:00:00.159) 0:00:26.732 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:15:50 -0500 (0:00:01.322) 0:00:28.054 **** ok: [managed-node3] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:15:50 -0500 (0:00:00.034) 0:00:28.089 **** ok: [managed-node3] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:15:50 -0500 (0:00:00.037) 0:00:28.127 **** ok: 
[managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:15:54 -0500 (0:00:03.812) 0:00:31.939 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:15:54 -0500 (0:00:00.089) 0:00:32.029 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:15:54 -0500 (0:00:00.046) 0:00:32.075 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:15:54 -0500 (0:00:00.040) 0:00:32.116 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:15:54 -0500 (0:00:00.035) 0:00:32.152 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:15:55 -0500 (0:00:00.671) 0:00:32.824 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": 
"plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": 
"selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:15:56 -0500 (0:00:01.066) 0:00:33.890 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:15:56 -0500 (0:00:00.086) 0:00:33.976 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:15:56 -0500 (0:00:00.051) 0:00:34.028 **** changed: [managed-node3] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, 
"disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:16:06 -0500 (0:00:10.167) 0:00:44.195 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:16:06 -0500 (0:00:00.034) 0:00:44.230 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948105.7832756, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1733948105.069275, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1733948105.069275, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:16:07 -0500 (0:00:00.335) 0:00:44.566 **** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:16:07 -0500 (0:00:00.482) 0:00:45.048 **** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:16:07 -0500 (0:00:00.030) 0:00:45.078 **** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "fs_type": null }, { "action": "create format", "device": 
"/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:16:07 -0500 (0:00:00.045) 0:00:45.124 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:16:07 -0500 (0:00:00.037) 0:00:45.161 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:16:07 -0500 (0:00:00.055) 0:00:45.216 **** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 December 2024 15:16:07 -0500 (0:00:00.033) 0:00:45.249 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:16:08 -0500 (0:00:00.799) 0:00:46.049 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 December 2024 15:16:09 -0500 (0:00:00.626) 0:00:46.675 **** skipping: [managed-node3] => (item={u'src': u'/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:16:09 -0500 (0:00:00.050) 0:00:46.726 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:16:09 -0500 (0:00:00.616) 0:00:47.342 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733946736.7818234, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, 
"blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1733946734.6568215, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263648, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1733946734.6568215, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071595669610", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:16:10 -0500 (0:00:00.462) 0:00:47.805 **** changed: [managed-node3] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Wednesday 11 December 2024 15:16:10 -0500 (0:00:00.473) 0:00:48.278 **** ok: [managed-node3] TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:93 Wednesday 11 December 2024 15:16:11 -0500 (0:00:00.739) 0:00:49.018 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 11 December 2024 15:16:11 -0500 (0:00:00.100) 0:00:49.119 **** skipping: [managed-node3] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 11 December 2024 15:16:11 -0500 (0:00:00.050) 0:00:49.169 **** ok: [managed-node3] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": 
"/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 11 December 2024 15:16:11 -0500 (0:00:00.067) 0:00:49.236 **** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "size": "10G", "type": "crypt", "uuid": "5c16b22a-6eb6-4fad-bb89-5ebbe851a5c5" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "7eec0a51-6822-4ebd-a2f8-2ec54b2ec255" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 11 December 2024 15:16:12 -0500 (0:00:00.674) 0:00:49.911 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003027", "end": "2024-12-11 15:16:12.963267", "rc": 0, "start": "2024-12-11 15:16:12.960240" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs 
ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 11 December 2024 15:16:13 -0500 (0:00:00.663) 0:00:50.574 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002970", "end": "2024-12-11 15:16:13.320776", "failed_when_result": false, "rc": 0, "start": "2024-12-11 15:16:13.317806" } STDOUT: luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:16:13 -0500 (0:00:00.327) 0:00:50.901 **** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:16:13 -0500 (0:00:00.033) 0:00:50.935 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:16:13 -0500 (0:00:00.109) 0:00:51.044 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:16:13 -0500 (0:00:00.122) 0:00:51.167 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for 
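That single line is the whole persistence story for the encrypted device: /etc/crypttab entries have the three-field form <mapped name> <backing device> <key file>, and the "-" in the key-file field means no key file is recorded, so the passphrase has to come from somewhere else at unlock time. Purely to illustrate the format (this is not how the role does it; its own crypttab handling reported "line added" earlier in the run), an equivalent hand-written task might be:

    - name: Add the same /etc/crypttab entry by hand (sketch only)
      ansible.builtin.lineinfile:
        path: /etc/crypttab
        # fields: <mapped name> <backing device> <key file; '-' = none recorded>
        line: "luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255 /dev/sda -"
        mode: "0600"  # matches the mode stat'ed on /etc/crypttab earlier in this run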
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:16:13 -0500 (0:00:00.327) 0:00:50.901 ****
TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:16:13 -0500 (0:00:00.033) 0:00:50.935 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3
TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:16:13 -0500 (0:00:00.109) 0:00:51.044 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:16:13 -0500 (0:00:00.122) 0:00:51.167 ****
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3
TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.435) 0:00:51.603 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255" }, "changed": false }
TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.063) 0:00:51.667 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.063) 0:00:51.730 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.057) 0:00:51.788 **** ok: [managed-node3] => { "changed": false }
MSG: All assertions passed
TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.078) 0:00:51.866 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.055) 0:00:51.921 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.066) 0:00:51.988 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.054) 0:00:52.042 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.055) 0:00:52.098 **** skipping: [managed-node3] => {
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.047) 0:00:52.146 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.043) 0:00:52.190 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.042) 0:00:52.233 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.101) 0:00:52.334 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.065) 0:00:52.400 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 11 December 2024 15:16:14 -0500 (0:00:00.112) 0:00:52.512 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 11 December 2024 15:16:15 -0500 (0:00:00.060) 0:00:52.573 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Wednesday 11 December 2024 15:16:15 -0500 (0:00:00.067) 0:00:52.641 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, 
"storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 11 December 2024 15:16:15 -0500 (0:00:00.057) 0:00:52.698 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 11 December 2024 15:16:15 -0500 (0:00:00.076) 0:00:52.775 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 11 December 2024 15:16:15 -0500 (0:00:00.062) 0:00:52.837 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948166.435341, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948166.435341, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28762, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1733948166.435341, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 11 December 2024 15:16:15 -0500 (0:00:00.465) 0:00:53.303 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 11 December 2024 15:16:15 -0500 (0:00:00.068) 0:00:53.371 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 11 December 2024 15:16:15 -0500 (0:00:00.055) 0:00:53.426 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 11 December 2024 15:16:15 -0500 (0:00:00.067) 0:00:53.494 **** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] 
****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 11 December 2024 15:16:16 -0500 (0:00:00.130) 0:00:53.624 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 11 December 2024 15:16:16 -0500 (0:00:00.101) 0:00:53.725 **** ok: [managed-node3] => { "changed": false }
MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 11 December 2024 15:16:16 -0500 (0:00:00.103) 0:00:53.829 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948166.562341, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948166.562341, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 385976, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1733948166.562341, "nlink": 1, "path": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 11 December 2024 15:16:16 -0500 (0:00:00.607) 0:00:54.436 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }
lsrpackages: cryptsetup
TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 11 December 2024 15:16:17 -0500 (0:00:00.820) 0:00:55.256 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.026546", "end": "2024-12-11 15:16:18.277480", "rc": 0, "start": "2024-12-11 15:16:18.250934" }
STDOUT:
LUKS header information for /dev/sda
Version:        1
Cipher name:    aes
Cipher mode:    xts-plain64
Hash spec:      sha256
Payload offset: 8192
MK bits:        512
MK digest:      7e 01 b6 37 08 17 9a 2a 63 db c6 2f 16 75 e8 48 fa 75 25 38
MK salt:        8a b7 b3 70 92 d1 20 94 f1 bd de 39 57 02 8c a3 ec e7 51 c0 3e 94 b7 3b 68 72 fc 19 75 04 8f a2
MK iterations:  23108
UUID:           7eec0a51-6822-4ebd-a2f8-2ec54b2ec255
Key Slot 0: ENABLED
        Iterations:             370258
        Salt:                   d0 ec ba eb 8b 8f cc 75 3b 13 7e b8 00 7a 60 20 45 bc fa c1 69 9b f6 e8 1f fe ab fa 77 73 0a 1e
        Key material offset:    8
        AF stripes:             4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
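The dump shows a LUKS1 header (the play never set encryption_luks_version, so the platform default applied) with the aes/xts-plain64 cipher and only key slot 0 in use. As a standalone spot-check of the same facts, something along these lines would work; it is a sketch, not part of this test, and the patterns are deliberately loose:

    - name: Re-read the LUKS header (the same command the test just ran)
      ansible.builtin.command: cryptsetup luksDump /dev/sda
      register: luks_dump
      changed_when: false

    - name: Spot-check the values reported above
      ansible.builtin.assert:
        that:
          - "luks_dump.stdout is search('Cipher name:\\s+aes')"
          - "luks_dump.stdout is search('Key Slot 0: ENABLED')"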
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 11 December 2024 15:16:18 -0500 (0:00:00.693) 0:00:55.950 **** ok: [managed-node3] => { "changed": false }
MSG: All assertions passed
TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 11 December 2024 15:16:18 -0500 (0:00:00.078) 0:00:56.028 **** ok: [managed-node3] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 11 December 2024 15:16:18 -0500 (0:00:00.090) 0:00:56.118 **** ok: [managed-node3] => { "changed": false }
MSG: All assertions passed
TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 11 December 2024 15:16:18 -0500 (0:00:00.117) 0:00:56.236 **** ok: [managed-node3] => { "changed": false }
MSG: All assertions passed
TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 11 December 2024 15:16:18 -0500 (0:00:00.067) 0:00:56.304 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Wednesday 11 December 2024 15:16:18 -0500 (0:00:00.066) 0:00:56.370 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Wednesday 11 December 2024 15:16:18 -0500 (0:00:00.057) 0:00:56.427 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Wednesday 11 December 2024 15:16:19 -0500 (0:00:00.094) 0:00:56.522 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Wednesday 11 December 2024 15:16:19 -0500 (0:00:00.166) 0:00:56.689 **** ok: [managed-node3] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] ******************************* task path:
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Wednesday 11 December 2024 15:16:19 -0500 (0:00:00.077) 0:00:56.766 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Wednesday 11 December 2024 15:16:19 -0500 (0:00:00.083) 0:00:56.850 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Wednesday 11 December 2024 15:16:19 -0500 (0:00:00.089) 0:00:56.939 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Wednesday 11 December 2024 15:16:19 -0500 (0:00:00.084) 0:00:57.024 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 11 December 2024 15:16:19 -0500 (0:00:00.076) 0:00:57.101 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 11 December 2024 15:16:19 -0500 (0:00:00.090) 0:00:57.191 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 11 December 2024 15:16:19 -0500 (0:00:00.117) 0:00:57.309 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 11 December 2024 15:16:19 -0500 (0:00:00.082) 0:00:57.392 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 11 December 2024 15:16:19 -0500 (0:00:00.067) 0:00:57.459 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 11 December 2024 
15:16:20 -0500 (0:00:00.068) 0:00:57.527 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.056) 0:00:57.584 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.070) 0:00:57.654 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.073) 0:00:57.728 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.055) 0:00:57.783 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.053) 0:00:57.837 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.057) 0:00:57.895 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.095) 0:00:57.990 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.082) 0:00:58.073 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.091) 0:00:58.165 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] 
********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.091) 0:00:58.256 **** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.063) 0:00:58.319 **** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.094) 0:00:58.414 **** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 11 December 2024 15:16:20 -0500 (0:00:00.098) 0:00:58.513 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Wednesday 11 December 2024 15:16:21 -0500 (0:00:00.099) 0:00:58.612 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Wednesday 11 December 2024 15:16:21 -0500 (0:00:00.067) 0:00:58.680 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Wednesday 11 December 2024 15:16:21 -0500 (0:00:00.119) 0:00:58.800 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Wednesday 11 December 2024 15:16:21 -0500 (0:00:00.117) 0:00:58.917 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Wednesday 11 December 2024 15:16:21 -0500 (0:00:00.123) 0:00:59.041 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Wednesday 11 December 2024 15:16:21 -0500 (0:00:00.086) 0:00:59.128 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable 
thin pool space from int to Size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Wednesday 11 December 2024 15:16:21 -0500 (0:00:00.061) 0:00:59.190 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Wednesday 11 December 2024 15:16:21 -0500 (0:00:00.059) 0:00:59.249 **** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Wednesday 11 December 2024 15:16:21 -0500 (0:00:00.071) 0:00:59.320 **** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Wednesday 11 December 2024 15:16:21 -0500 (0:00:00.063) 0:00:59.384 **** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Wednesday 11 December 2024 15:16:21 -0500 (0:00:00.088) 0:00:59.472 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.125) 0:00:59.598 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.105) 0:00:59.703 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.072) 0:00:59.776 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.072) 0:00:59.848 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.068) 0:00:59.916 **** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", 
"skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.062) 0:00:59.979 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.116) 0:01:00.095 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.137) 0:01:00.233 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.059) 0:01:00.293 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.065) 0:01:00.358 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.069) 0:01:00.428 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 11 December 2024 15:16:22 -0500 (0:00:00.053) 0:01:00.481 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 11 December 2024 15:16:23 -0500 (0:00:00.055) 0:01:00.536 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 11 December 2024 15:16:23 -0500 (0:00:00.058) 0:01:00.595 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 11 December 2024 15:16:23 -0500 (0:00:00.068) 0:01:00.663 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Wednesday 11 December 2024 15:16:23 -0500 (0:00:00.054) 0:01:00.718 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 11 December 2024 15:16:23 -0500 (0:00:00.062) 0:01:00.780 **** changed: [managed-node3] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:99 Wednesday 11 December 2024 15:16:24 -0500 (0:00:00.943) 0:01:01.724 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 11 December 2024 15:16:24 -0500 (0:00:00.179) 0:01:01.904 **** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 11 December 2024 15:16:24 -0500 (0:00:00.064) 0:01:01.968 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:16:24 -0500 (0:00:00.095) 0:01:02.063 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:16:24 -0500 (0:00:00.086) 0:01:02.149 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:16:24 -0500 (0:00:00.074) 0:01:02.224 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => 
{ "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:16:24 -0500 (0:00:00.146) 0:01:02.370 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:16:24 -0500 (0:00:00.059) 0:01:02.429 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:16:25 -0500 (0:00:00.119) 0:01:02.548 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:16:25 -0500 (0:00:00.062) 0:01:02.611 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:16:25 -0500 (0:00:00.072) 0:01:02.683 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:16:25 -0500 (0:00:00.196) 0:01:02.879 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", 
"libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:16:26 -0500 (0:00:01.547) 0:01:04.426 **** ok: [managed-node3] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:16:26 -0500 (0:00:00.082) 0:01:04.509 **** ok: [managed-node3] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:16:27 -0500 (0:00:00.090) 0:01:04.600 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:16:30 -0500 (0:00:03.837) 0:01:08.437 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:16:31 -0500 (0:00:00.100) 0:01:08.538 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:16:31 -0500 (0:00:00.049) 0:01:08.588 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:16:31 -0500 (0:00:00.047) 0:01:08.635 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:16:31 -0500 (0:00:00.041) 0:01:08.677 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:16:31 -0500 (0:00:00.694) 0:01:09.371 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": 
"static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:16:32 -0500 (0:00:00.980) 0:01:10.352 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:16:32 -0500 (0:00:00.076) 0:01:10.428 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 
15:16:32 -0500 (0:00:00.043) 0:01:10.472 **** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Wednesday 11 December 2024 15:16:37 -0500 (0:00:04.212) 0:01:14.685 **** fatal: [managed-node3]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10733223936, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd 
cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:16:37 -0500 (0:00:00.085) 0:01:14.770 **** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 11 December 2024 15:16:37 -0500 (0:00:00.059) 0:01:14.829 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 11 December 2024 15:16:37 -0500 (0:00:00.078) 0:01:14.908 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 11 December 2024 15:16:37 -0500 (0:00:00.109) 0:01:15.018 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 11 December 2024 15:16:37 -0500 (0:00:00.056) 0:01:15.075 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948184.0413594, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1733948184.0413594, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1733948184.0413594, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1838965837", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 11 December 2024 15:16:38 -0500 (0:00:00.467) 0:01:15.543 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:119 Wednesday 11 December 2024 15:16:38 -0500 (0:00:00.102) 0:01:15.646 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:16:38 -0500 (0:00:00.202) 0:01:15.848 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:16:38 -0500 (0:00:00.107) 0:01:15.955 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:16:38 -0500 (0:00:00.069) 0:01:16.025 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:16:38 -0500 (0:00:00.187) 0:01:16.213 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:16:38 -0500 (0:00:00.049) 0:01:16.263 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:16:38 -0500 (0:00:00.045) 0:01:16.309 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:16:38 -0500 (0:00:00.046) 0:01:16.355 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:16:38 -0500 (0:00:00.048) 0:01:16.404 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 
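The safe-mode failure recorded above is the point of the preceding test: with the role's storage_safe_mode variable at true (the value stored in storage_safe_mode_global), blivet refuses to destroy the existing LUKS formatting on sda, and the "Remove the encryption layer" run now starting performs the removal for real. A minimal sketch of what such an invocation looks like, reconstructed from the variables printed in this log rather than taken from the test source (the exact task layout and the storage_safe_mode: false setting are assumptions):

    - name: Remove the encryption layer
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        # Assumption: safe mode was true in the failed run above; disabling it
        # is what permits blivet to remove existing formatting on the device.
        storage_safe_mode: false
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            mount_point: /opt/test1
            encryption: false            # request removal of the LUKS layer
            encryption_password: yabbadabbadoo

Running the same volume spec with safe mode left enabled reproduces the "cannot remove existing formatting on device ... in safe mode due to encryption removal" error that verify-role-failed.yml asserts on above.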
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:16:39 -0500 (0:00:00.138) 0:01:16.542 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:16:40 -0500 (0:00:01.529) 0:01:18.072 **** ok: [managed-node3] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:16:40 -0500 (0:00:00.082) 0:01:18.155 **** ok: [managed-node3] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:16:40 -0500 (0:00:00.150) 0:01:18.305 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:16:44 -0500 (0:00:04.144) 0:01:22.449 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:16:45 -0500 (0:00:00.101) 0:01:22.551 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:16:45 -0500 (0:00:00.050) 0:01:22.602 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] 
************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:16:45 -0500 (0:00:00.055) 0:01:22.658 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:16:45 -0500 (0:00:00.052) 0:01:22.711 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:16:46 -0500 (0:00:00.850) 0:01:23.562 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": 
"stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, 
"halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:16:47 -0500 (0:00:01.073) 0:01:24.635 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:16:47 -0500 (0:00:00.073) 0:01:24.709 ****
TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:16:47 -0500 (0:00:00.055) 0:01:24.764 **** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:16:51 -0500 (0:00:04.401) 0:01:29.166 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
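The four-step action list above is the whole sequence for stripping LUKS from an in-service volume: blivet destroys the xfs format inside the mapping, removes the /dev/mapper device, destroys the LUKS format on /dev/sda, and only then creates xfs directly on the raw disk. A minimal sketch of a play that would request this end state; the volume values (name, disk, filesystem, mount point) are read off the output above, while luks_passphrase is a hypothetical stand-in for the passphrase the test supplies via no_log:

- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks: [sda]
            fs_type: xfs
            mount_point: /opt/test1
            # flipping this from true to false is what drives the four actions above
            encryption: false
            # hypothetical variable; the real test passes the passphrase via no_log
            encryption_password: "{{ luks_passphrase }}"

The "mounts" list shows the matching fstab handover the role queues at the same time: the /dev/mapper-based entry goes absent and a UUID-based entry takes its place.

TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: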
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:16:51 -0500 (0:00:00.050) 0:01:29.217 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948169.0783436, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "8dfe327817fdde9cc0522a02e981b00d799df64b", "ctime": 1733948169.0753436, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1733948169.0753436, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:16:52 -0500 (0:00:00.387) 0:01:29.604 **** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:16:52 -0500 (0:00:00.376) 0:01:29.980 **** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:16:52 -0500 (0:00:00.041) 0:01:30.022 **** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", 
"encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:16:52 -0500 (0:00:00.068) 0:01:30.091 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:16:52 -0500 (0:00:00.052) 0:01:30.143 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:16:52 -0500 (0:00:00.057) 0:01:30.200 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 
December 2024 15:16:53 -0500 (0:00:00.391) 0:01:30.592 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:16:53 -0500 (0:00:00.586) 0:01:31.179 **** changed: [managed-node3] => (item={u'src': u'UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 December 2024 15:16:54 -0500 (0:00:00.497) 0:01:31.676 **** skipping: [managed-node3] => (item={u'src': u'UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:16:54 -0500 (0:00:00.054) 0:01:31.731 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:16:54 -0500 (0:00:00.608) 0:01:32.339 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948173.319348, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "28944def70d8ef9bbbae11481dc65a22d38f9d05", "ctime": 1733948170.6763453, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263817, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1733948170.6763453, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "18446744071595675069", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just 
made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:16:55 -0500 (0:00:00.609) 0:01:32.948 **** changed: [managed-node3] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed
TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Wednesday 11 December 2024 15:16:55 -0500 (0:00:00.453) 0:01:33.401 **** ok: [managed-node3]
TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:132 Wednesday 11 December 2024 15:16:56 -0500 (0:00:00.958) 0:01:34.360 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3
TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 11 December 2024 15:16:57 -0500 (0:00:00.215) 0:01:34.575 **** skipping: [managed-node3] => {}
TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 11 December 2024 15:16:57 -0500 (0:00:00.087) 0:01:34.663 **** ok: [managed-node3] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }
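The crypttab cleanup at the top of this block matched exactly one entry ("found": 1) and reported one line removed. An /etc/crypttab line has the form <name> <backing-device> <password>, so the line dropped here was luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255 /dev/sda -. The role performs this through its own module; a rough hand-rolled equivalent, for illustration only:

- name: Drop the stale LUKS mapping from /etc/crypttab
  lineinfile:
    path: /etc/crypttab
    regexp: '^luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255\s'
    state: absent

TASK [Collect info about the volumes.]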
***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 11 December 2024 15:16:57 -0500 (0:00:00.081) 0:01:34.745 **** ok: [managed-node3] => { "changed": false, "info": { "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "d741ed8a-cf73-4083-ab20-2c91b77b7f8c" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } }
TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 11 December 2024 15:16:57 -0500 (0:00:00.504) 0:01:35.249 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003027", "end": "2024-12-11 15:16:58.084545", "rc": 0, "start": "2024-12-11 15:16:58.081518" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c /opt/test1 xfs defaults 0 0
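The per-device "info" map above (fstype, label, mountpoint, size, type, uuid for every disk) comes from the test's own info-gathering task; roughly the same view can be pulled ad hoc with util-linux, assuming lsblk is available on the managed node:

- name: Collect comparable per-device info with lsblk
  command: lsblk --pairs -o NAME,FSTYPE,LABEL,MOUNTPOINT,SIZE,TYPE,UUID
  register: lsblk_out
  changed_when: false  # read-only query

The fstab read that follows confirms the handover the role queued earlier: the old /dev/mapper/luks-* source line is gone and UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c /opt/test1 xfs defaults 0 0 is the only entry left for the test volume.

TASK [Read the /etc/crypttab file]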
********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 11 December 2024 15:16:58 -0500 (0:00:00.471) 0:01:35.720 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002923", "end": "2024-12-11 15:16:58.528411", "failed_when_result": false, "rc": 0, "start": "2024-12-11 15:16:58.525488" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:16:58 -0500 (0:00:00.471) 0:01:36.192 **** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:16:58 -0500 (0:00:00.077) 0:01:36.269 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:16:58 -0500 (0:00:00.161) 0:01:36.431 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:16:59 -0500 (0:00:00.150) 0:01:36.581 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 11 December 2024 15:16:59 -0500 (0:00:00.381) 0:01:36.963 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 11 December 2024 15:16:59 -0500 (0:00:00.108) 0:01:37.072 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 11 December 2024 15:16:59 -0500 (0:00:00.077) 0:01:37.150 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Wednesday 11 December 2024 15:16:59 -0500 (0:00:00.059) 0:01:37.210 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Wednesday 11 December 2024 15:16:59 -0500 (0:00:00.114) 0:01:37.324 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Wednesday 11 December 2024 15:16:59 -0500 (0:00:00.083) 0:01:37.407 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Wednesday 11 December 2024 15:16:59 -0500 (0:00:00.073) 0:01:37.481 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Wednesday 11 December 2024 15:17:00 -0500 (0:00:00.060) 0:01:37.541 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Wednesday 11 December 2024 15:17:00 -0500 (0:00:00.057) 0:01:37.598 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Wednesday 11 December 2024 15:17:00 -0500 (0:00:00.074) 0:01:37.673 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Wednesday 11 December 
2024 15:17:00 -0500 (0:00:00.070) 0:01:37.743 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 11 December 2024 15:17:00 -0500 (0:00:00.065) 0:01:37.808 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 11 December 2024 15:17:00 -0500 (0:00:00.127) 0:01:37.935 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 11 December 2024 15:17:00 -0500 (0:00:00.080) 0:01:38.016 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 11 December 2024 15:17:00 -0500 (0:00:00.075) 0:01:38.092 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 11 December 2024 15:17:00 -0500 (0:00:00.100) 0:01:38.192 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Wednesday 11 December 2024 15:17:00 -0500 (0:00:00.098) 0:01:38.291 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 11 December 2024 15:17:00 -0500 (0:00:00.079) 0:01:38.371 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
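The fstab checks above all follow one pattern: match lists such as storage_test_fstab_id_matches are built from /etc/fstab, then each list's length is asserted against its expected count of "1". Restated as a bare assert, with the variable names taken from the log (the exact conditions in the test file may differ):

- name: Volume must appear exactly once in /etc/fstab
  assert:
    that:
      - storage_test_fstab_id_matches | length == storage_test_fstab_expected_id_matches | int
      - storage_test_fstab_mount_point_matches | length == storage_test_fstab_expected_mount_point_matches | int

TASK [Verify fs label] ********************************************************* task path: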
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 11 December 2024 15:17:01 -0500 (0:00:00.151) 0:01:38.522 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 11 December 2024 15:17:01 -0500 (0:00:00.160) 0:01:38.683 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948211.5193882, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948211.5193882, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28762, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1733948211.5193882, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 11 December 2024 15:17:01 -0500 (0:00:00.804) 0:01:39.487 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 11 December 2024 15:17:02 -0500 (0:00:00.113) 0:01:39.602 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 11 December 2024 15:17:02 -0500 (0:00:00.086) 0:01:39.688 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 11 December 2024 15:17:02 -0500 (0:00:00.127) 0:01:39.816 **** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }
TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 11 December 2024 15:17:02 -0500 (0:00:00.082) 0:01:39.899 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 11 December 2024 15:17:02 -0500 (0:00:00.090) 0:01:39.989 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
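Device-node verification above likewise reduces to a stat plus assertions on the returned flags; the log shows /dev/sda with isblk: true, mode 0660, group disk. A condensed sketch of the same kind of check (the field names are exactly those in the stat output above):

- name: Stat the volume's device node
  stat:
    path: /dev/sda
  register: dev_node

- name: A present volume must be backed by a block device node
  assert:
    that:
      - dev_node.stat.exists
      - dev_node.stat.isblk

TASK [Stat the LUKS device, if encrypted]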
************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 11 December 2024 15:17:02 -0500 (0:00:00.091) 0:01:40.081 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 11 December 2024 15:17:02 -0500 (0:00:00.098) 0:01:40.180 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 11 December 2024 15:17:03 -0500 (0:00:00.902) 0:01:41.082 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 11 December 2024 15:17:03 -0500 (0:00:00.075) 0:01:41.158 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 11 December 2024 15:17:03 -0500 (0:00:00.126) 0:01:41.285 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 11 December 2024 15:17:03 -0500 (0:00:00.111) 0:01:41.396 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 11 December 2024 15:17:03 -0500 (0:00:00.094) 0:01:41.491 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.085) 0:01:41.576 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.057) 0:01:41.634 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.061) 0:01:41.696 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.096) 0:01:41.792 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.069) 0:01:41.862 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.076) 0:01:41.939 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.073) 0:01:42.013 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.054) 0:01:42.068 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.082) 0:01:42.150 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.074) 0:01:42.224 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.053) 0:01:42.277 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] 
************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.054) 0:01:42.332 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.056) 0:01:42.388 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 11 December 2024 15:17:04 -0500 (0:00:00.056) 0:01:42.445 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 11 December 2024 15:17:05 -0500 (0:00:00.092) 0:01:42.537 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 11 December 2024 15:17:05 -0500 (0:00:00.070) 0:01:42.607 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 11 December 2024 15:17:05 -0500 (0:00:00.069) 0:01:42.677 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 11 December 2024 15:17:05 -0500 (0:00:00.070) 0:01:42.747 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 11 December 2024 15:17:05 -0500 (0:00:00.115) 0:01:42.863 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 11 December 2024 15:17:05 -0500 (0:00:00.114) 0:01:42.977 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 11 December 2024 
15:17:05 -0500 (0:00:00.132) 0:01:43.110 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 11 December 2024 15:17:05 -0500 (0:00:00.077) 0:01:43.187 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 11 December 2024 15:17:05 -0500 (0:00:00.123) 0:01:43.310 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 11 December 2024 15:17:05 -0500 (0:00:00.082) 0:01:43.393 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 11 December 2024 15:17:05 -0500 (0:00:00.066) 0:01:43.460 **** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.067) 0:01:43.527 **** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.071) 0:01:43.598 **** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.062) 0:01:43.660 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.080) 0:01:43.741 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.054) 0:01:43.795 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 
Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.070) 0:01:43.866 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.082) 0:01:43.948 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.074) 0:01:44.023 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.104) 0:01:44.127 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.059) 0:01:44.187 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.069) 0:01:44.256 **** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.066) 0:01:44.323 **** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.082) 0:01:44.406 **** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Wednesday 11 December 2024 15:17:06 -0500 (0:00:00.088) 0:01:44.495 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.079) 0:01:44.575 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 
Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.077) 0:01:44.652 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.092) 0:01:44.745 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.082) 0:01:44.828 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.070) 0:01:44.898 **** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.088) 0:01:44.986 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.061) 0:01:45.047 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.061) 0:01:45.109 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.062) 0:01:45.171 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.054) 0:01:45.226 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.058) 0:01:45.284 **** skipping: [managed-node3] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.059) 0:01:45.344 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 11 December 2024 15:17:07 -0500 (0:00:00.142) 0:01:45.486 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 11 December 2024 15:17:08 -0500 (0:00:00.063) 0:01:45.549 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 11 December 2024 15:17:08 -0500 (0:00:00.067) 0:01:45.617 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Wednesday 11 December 2024 15:17:08 -0500 (0:00:00.061) 0:01:45.678 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 11 December 2024 15:17:08 -0500 (0:00:00.061) 0:01:45.739 **** changed: [managed-node3] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:138 Wednesday 11 December 2024 15:17:08 -0500 (0:00:00.703) 0:01:46.443 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 11 December 2024 15:17:09 -0500 (0:00:00.177) 0:01:46.621 **** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 11 December 2024 15:17:09 -0500 
TASK [Verify role raises correct error] ****************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Wednesday 11 December 2024 15:17:09 -0500 (0:00:00.102) 0:01:46.723 ****

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Wednesday 11 December 2024 15:17:09 -0500 (0:00:00.087) 0:01:46.811 ****
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Wednesday 11 December 2024 15:17:09 -0500 (0:00:00.150) 0:01:46.961 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Wednesday 11 December 2024 15:17:09 -0500 (0:00:00.079) 0:01:47.041 ****
skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Wednesday 11 December 2024 15:17:09 -0500 (0:00:00.237) 0:01:47.279 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Wednesday 11 December 2024 15:17:09 -0500 (0:00:00.080) 0:01:47.360 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Wednesday 11 December 2024 15:17:09 -0500 (0:00:00.054) 0:01:47.414 ****
ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Wednesday 11 December 2024 15:17:09 -0500 (0:00:00.057) 0:01:47.471 ****
ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Wednesday 11 December 2024 15:17:10 -0500 (0:00:00.070) 0:01:47.542 ****
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Wednesday 11 December 2024 15:17:10 -0500 (0:00:00.144) 0:01:47.687 ****
ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34

TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Wednesday 11 December 2024 15:17:11 -0500 (0:00:01.348) 0:01:49.035 ****
ok: [managed-node3] => { "storage_pools": [] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Wednesday 11 December 2024 15:17:11 -0500 (0:00:00.075) 0:01:49.110 ****
ok: [managed-node3] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 11 December 2024 15:17:11 -0500 (0:00:00.072) 0:01:49.183 ****
ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
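Note the two-pass use of the role's blivet module here: the "Get required packages" call only computes what the requested state would need (the module_args dump later in this log shows a packages_only parameter), so the role can install prerequisites before touching any device; adding LUKS to a plain disk volume needs only cryptsetup. A sketch of the install step that follows (illustrative; treating blivet_output as a hypothetical register name for the call above):

    - name: Make sure required packages are installed
      package:
        name: "{{ blivet_output.packages }}"   # here ['cryptsetup']
        state: present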
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Wednesday 11 December 2024 15:17:15 -0500 (0:00:04.014) 0:01:53.197 ****
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 11 December 2024 15:17:15 -0500 (0:00:00.105) 0:01:53.303 ****

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 11 December 2024 15:17:15 -0500 (0:00:00.040) 0:01:53.344 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Wednesday 11 December 2024 15:17:15 -0500 (0:00:00.041) 0:01:53.385 ****

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Wednesday 11 December 2024 15:17:15 -0500 (0:00:00.041) 0:01:53.427 ****
ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
lsrpackages: cryptsetup kpartx

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Wednesday 11 December 2024 15:17:16 -0500 (0:00:00.815) 0:01:54.242 ****
ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name":
"cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": 
"firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", 
"state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service": { "name": "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": 
"systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:17:17 -0500 (0:00:01.236) 0:01:55.479 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:17:18 -0500 (0:00:00.167) 0:01:55.646 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d7eec0a51\x2d6822\x2d4ebd\x2da2f8\x2d2ec54b2ec255.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service", "name": "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service systemd-readahead-replay.service cryptsetup-pre.target systemd-journald.socket dev-sda.device system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup 
detach luks-7eec0a51-6822-4ebd-a2f8-2ec54b2ec255 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } }
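The mask step above exists because the /etc/crypttab entry left over from the earlier LUKS configuration makes systemd generate a systemd-cryptsetup@… unit bound to dev-sda.device (note "SourcePath": "/etc/crypttab" and the /run/systemd/generator fragment path in the status dump). Masking it keeps systemd from trying to start crypto setup for sda while blivet is rewriting the device; a matching unmask step runs after the manage step. A sketch of the masking (illustrative, not the role's literal task; it uses the storage_cryptsetup_services fact set just above):

    - name: Mask the systemd cryptsetup services
      systemd:
        name: "{{ item }}"
        masked: true
      loop: "{{ storage_cryptsetup_services }}"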
TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
Wednesday 11 December 2024 15:17:18 -0500 (0:00:00.645) 0:01:56.292 ****
fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] }
MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption

TASK [fedora.linux_system_roles.storage : Failed message] **********************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109
Wednesday 11 December 2024 15:17:22 -0500 (0:00:03.957) 0:02:00.249 ****
fatal: [managed-node3]: FAILED! => { "changed": false }
MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda' in safe mode due to adding encryption"}

TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113
Wednesday 11 December
2024 15:17:22 -0500 (0:00:00.085) 0:02:00.335 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d7eec0a51\x2d6822\x2d4ebd\x2da2f8\x2d2ec54b2ec255.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service", "name": "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d7eec0a51\\x2d6822\\x2d4ebd\\x2da2f8\\x2d2ec54b2ec255.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", 
"StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 11 December 2024 15:17:23 -0500 (0:00:00.597) 0:02:00.932 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 11 December 2024 15:17:23 -0500 (0:00:00.063) 0:02:00.995 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 11 December 2024 15:17:23 -0500 (0:00:00.079) 0:02:01.075 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 11 December 2024 15:17:23 -0500 (0:00:00.060) 0:02:01.136 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948228.7554064, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1733948228.7554064, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1733948228.7554064, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "3639750", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 11 December 2024 15:17:24 -0500 (0:00:00.741) 0:02:01.878 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:158 Wednesday 11 December 2024 15:17:24 -0500 (0:00:00.075) 0:02:01.954 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 
15:17:24 -0500 (0:00:00.197) 0:02:02.151 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:17:24 -0500 (0:00:00.103) 0:02:02.255 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:17:24 -0500 (0:00:00.061) 0:02:02.316 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:17:24 -0500 (0:00:00.175) 0:02:02.492 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:17:25 -0500 (0:00:00.095) 0:02:02.587 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:17:25 -0500 (0:00:00.078) 0:02:02.665 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:17:25 -0500 (0:00:00.054) 0:02:02.720 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
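
Two details of the variable lookup above are worth calling out: the role walks its vars files from generic to specific (RedHat.yml, CentOS.yml, CentOS_7.yml, CentOS_7.9.yml) and loads whichever match the host, and the package list itself embeds a Jinja2 conditional so s390x hosts get libblockdev-s390 in place of libblockdev. Restated as the vars file the log says it loaded:

    # vars/CentOS_7.yml (as echoed in ansible_included_var_files above)
    blivet_package_list:
      - python-enum34
      - python-blivet3
      - libblockdev-crypto
      - libblockdev-dm
      - libblockdev-lvm
      - libblockdev-mdraid
      - libblockdev-swap
      - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
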
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:17:25 -0500 (0:00:00.050) 0:02:02.771 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:17:25 -0500 (0:00:00.094) 0:02:02.865 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:17:26 -0500 (0:00:01.083) 0:02:03.949 **** ok: [managed-node3] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:17:26 -0500 (0:00:00.039) 0:02:03.988 **** ok: [managed-node3] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:17:26 -0500 (0:00:00.050) 0:02:04.038 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:17:30 -0500 (0:00:03.910) 0:02:07.948 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:17:30 -0500 (0:00:00.104) 0:02:08.053 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
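
"Get required packages" is a planning-only pass: judging by the packages_only argument visible in the earlier failure dump, the role asks blivet what the requested layout would need without changing anything, and encrypting the volume adds cryptsetup to the answer. The follow-up install is then presumably an ordinary package task over that computed list, roughly:

    - name: Make sure required packages are installed
      package:
        name: "{{ blivet_output.packages | union(['kpartx']) }}"   # hypothetical expression
        state: present
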
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:17:30 -0500 (0:00:00.046) 0:02:08.099 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:17:30 -0500 (0:00:00.045) 0:02:08.145 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:17:30 -0500 (0:00:00.042) 0:02:08.187 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:17:31 -0500 (0:00:00.678) 0:02:08.866 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": 
"console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, 
"network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": 
"systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:17:32 -0500 (0:00:01.217) 0:02:10.083 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:17:32 -0500 (0:00:00.056) 0:02:10.140 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:17:32 -0500 (0:00:00.033) 0:02:10.174 **** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-eac68494-576f-4376-a0f8-295dff8524d7", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", 
"vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:17:42 -0500 (0:00:10.229) 0:02:20.403 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:17:42 -0500 (0:00:00.065) 0:02:20.469 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948214.051391, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9740bbc9b4dfb4ae7c8c3ba9d0103e52eeba1551", "ctime": 1733948214.0483909, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1733948214.0483909, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:17:43 -0500 (0:00:00.498) 0:02:20.967 **** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:17:43 -0500 (0:00:00.444) 0:02:21.412 **** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:17:43 -0500 (0:00:00.051) 0:02:21.463 **** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-eac68494-576f-4376-a0f8-295dff8524d7", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", 
"state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:17:44 -0500 (0:00:00.123) 0:02:21.587 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:17:44 -0500 (0:00:00.066) 0:02:21.653 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:17:44 -0500 (0:00:00.067) 0:02:21.721 **** changed: [managed-node3] => (item={u'src': u'UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c', u'state': 
u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=d741ed8a-cf73-4083-ab20-2c91b77b7f8c" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 December 2024 15:17:44 -0500 (0:00:00.604) 0:02:22.325 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:17:45 -0500 (0:00:00.802) 0:02:23.128 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 December 2024 15:17:46 -0500 (0:00:00.491) 0:02:23.619 **** skipping: [managed-node3] => (item={u'src': u'/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:17:46 -0500 (0:00:00.103) 0:02:23.722 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:17:47 -0500 (0:00:00.797) 0:02:24.520 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948218.5273955, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1733948215.7563927, "dev": 51713, 
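
The mount handling swaps /opt/test1 from the old UUID= source to the new /dev/mapper path in two steps: the obsolete fstab entry is removed (state: absent), systemd's view of fstab is refreshed, and the new entry is written and mounted (state: mounted). A sketch of the second half, assuming the standard mount module and the values from the loop item above:

    - name: Set up new/current mounts
      mount:
        path: /opt/test1
        src: /dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7
        fstype: xfs
        opts: defaults
        state: mounted
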
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263814, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1733948215.7553926, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071595675227", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:17:47 -0500 (0:00:00.664) 0:02:25.184 **** changed: [managed-node3] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-eac68494-576f-4376-a0f8-295dff8524d7', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-eac68494-576f-4376-a0f8-295dff8524d7", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Wednesday 11 December 2024 15:17:48 -0500 (0:00:00.677) 0:02:25.862 **** ok: [managed-node3] TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:171 Wednesday 11 December 2024 15:17:49 -0500 (0:00:00.874) 0:02:26.737 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 11 December 2024 15:17:49 -0500 (0:00:00.131) 0:02:26.868 **** skipping: [managed-node3] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 11 December 2024 15:17:49 -0500 (0:00:00.056) 0:02:26.925 **** ok: [managed-node3] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 11 December 2024 15:17:49 -0500 (0:00:00.075) 0:02:27.000 **** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "size": "10G", "type": "crypt", "uuid": "091311f9-efe9-42d0-949d-be43fb410efc" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "eac68494-576f-4376-a0f8-295dff8524d7" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 11 December 2024 15:17:49 -0500 (0:00:00.413) 0:02:27.414 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003007", "end": "2024-12-11 15:17:50.204507", "rc": 0, "start": "2024-12-11 15:17:50.201500" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs 
defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 11 December 2024 15:17:50 -0500 (0:00:00.399) 0:02:27.814 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002953", "end": "2024-12-11 15:17:50.609880", "failed_when_result": false, "rc": 0, "start": "2024-12-11 15:17:50.606927" } STDOUT: luks-eac68494-576f-4376-a0f8-295dff8524d7 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:17:50 -0500 (0:00:00.404) 0:02:28.218 **** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:17:50 -0500 (0:00:00.053) 0:02:28.272 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:17:50 -0500 (0:00:00.110) 0:02:28.383 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:17:50 -0500 (0:00:00.055) 0:02:28.439 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.187) 0:02:28.627 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.054) 0:02:28.682 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.070) 0:02:28.752 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.140) 0:02:28.893 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.070) 0:02:28.964 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.059) 0:02:29.023 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.067) 0:02:29.091 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.065) 0:02:29.157 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.057) 0:02:29.215 **** skipping: [managed-node3] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.061) 0:02:29.277 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.065) 0:02:29.342 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.049) 0:02:29.392 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 11 December 2024 15:17:51 -0500 (0:00:00.082) 0:02:29.474 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 11 December 2024 15:17:52 -0500 (0:00:00.058) 0:02:29.533 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 11 December 2024 15:17:52 -0500 (0:00:00.045) 0:02:29.578 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 11 December 2024 15:17:52 -0500 (0:00:00.040) 0:02:29.619 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Wednesday 11 December 2024 15:17:52 -0500 (0:00:00.043) 0:02:29.663 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, 
"storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 11 December 2024 15:17:52 -0500 (0:00:00.038) 0:02:29.701 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 11 December 2024 15:17:52 -0500 (0:00:00.050) 0:02:29.752 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 11 December 2024 15:17:52 -0500 (0:00:00.053) 0:02:29.806 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948262.6244419, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948262.6244419, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28762, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1733948262.6244419, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 11 December 2024 15:17:52 -0500 (0:00:00.427) 0:02:30.234 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 11 December 2024 15:17:52 -0500 (0:00:00.092) 0:02:30.326 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 11 December 2024 15:17:52 -0500 (0:00:00.097) 0:02:30.424 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 11 December 2024 15:17:52 -0500 (0:00:00.085) 0:02:30.510 **** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 11 December 2024 15:17:53 -0500 (0:00:00.059) 0:02:30.569 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 11 December 2024 15:17:53 -0500 (0:00:00.065) 0:02:30.635 ****
ok: [managed-node3] => { "changed": false }
MSG: All assertions passed

TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 11 December 2024 15:17:53 -0500 (0:00:00.103) 0:02:30.739 ****
ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948262.7524421, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948262.7524421, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 403405, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1733948262.7524421, "nlink": 1, "path": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 11 December 2024 15:17:53 -0500 (0:00:00.476) 0:02:31.215 ****
ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }
lsrpackages: cryptsetup

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 11 December 2024 15:17:54 -0500 (0:00:00.793) 0:02:32.009 ****
ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.027098", "end": "2024-12-11 15:17:54.915192", "rc": 0, "start": "2024-12-11 15:17:54.888094" }
STDOUT:
LUKS header information for /dev/sda
Version:        1
Cipher name:    aes
Cipher mode:    xts-plain64
Hash spec:      sha256
Payload offset: 8192
MK bits:        512
MK digest:      21 71 e3 c2 dd 94 64 5d af 8d 73 e5 d2 5a 6a dc 78 34 a4 9b
MK salt:        c5 d6 89 2d 75 88 33 59 1c e4 d9 00 17 fd 7f 8b 4b 56 03 3e bf de e4 d8 4b 4b a5 d5 0a 35 60 0d
MK iterations:  23141
UUID:           eac68494-576f-4376-a0f8-295dff8524d7
Key Slot 0: ENABLED
  Iterations:          369216
  Salt:                af 85 9b eb ac d8 00 ba 9d b7 e4 f0 2d 6f 5b 2c c3 91 71 19 8d 84 d0 71 73 d2 22 1d 26 be 7d a9
  Key material offset: 8
  AF stripes:          4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
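
Note: the luksDump report above is a plain key/value listing of the LUKS1 header (cipher, hash, master-key parameters, and the eight key slots, of which only slot 0 holds a passphrase here). The test drives it through the command module exactly as shown; a standalone sketch of the same query, with an added assertion on the header version (the luks_dump register name and the assert task are illustrative):

    - name: Dump the LUKS header (read-only query)
      command: cryptsetup luksDump /dev/sda
      register: luks_dump
      changed_when: false  # luksDump never modifies the device

    - name: Confirm the header is LUKS version 1
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+1')

TASK [Verify the presence/absence of the LUKS device node] ********************* task path: 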
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.534) 0:02:32.543 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.082) 0:02:32.626 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.102) 0:02:32.729 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.076) 0:02:32.805 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.079) 0:02:32.885 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.054) 0:02:32.939 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.049) 0:02:32.988 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.050) 0:02:33.039 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-eac68494-576f-4376-a0f8-295dff8524d7 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.055) 0:02:33.094 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: 
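
Note: the entry being validated here follows the crypttab(5) layout, whitespace-separated fields of mapped name, backing device, key file, and optional options; the "-" in the third field means no key file is stored, so the passphrase must be supplied when the volume is unlocked. A hedged standalone version of the check (the crypttab register name and the exact pattern are illustrative):

    - name: Read /etc/crypttab
      command: cat /etc/crypttab
      register: crypttab
      changed_when: false

    - name: Expect exactly one entry for this LUKS mapping
      assert:
        that:
          - crypttab.stdout_lines | select('search', '^luks-eac68494') | list | length == 1
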
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.047) 0:02:33.142 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.049) 0:02:33.192 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.062) 0:02:33.255 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.076) 0:02:33.331 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.084) 0:02:33.416 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 11 December 2024 15:17:55 -0500 (0:00:00.098) 0:02:33.514 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.061) 0:02:33.576 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.056) 0:02:33.633 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.055) 0:02:33.688 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 11 December 2024 
15:17:56 -0500 (0:00:00.068) 0:02:33.757 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.098) 0:02:33.856 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.088) 0:02:33.944 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.074) 0:02:34.019 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.070) 0:02:34.089 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.058) 0:02:34.147 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.065) 0:02:34.213 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.077) 0:02:34.290 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.063) 0:02:34.354 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.060) 0:02:34.414 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] 
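
Note: the "VARIABLE IS NOT DEFINED!" text in the Show expected size result above is the debug module's standard rendering of an undefined variable, not a failure; storage_test_expected_size appears to be populated only on the LVM/thin-pool code paths, which are all skipped for this plain-disk volume. A debug task can sidestep the warning with Jinja's default filter, as in this sketch (the fallback string is illustrative):

    - name: Show expected size without tripping on undefined vars
      debug:
        msg: "{{ storage_test_expected_size | default('not set for non-LVM volumes') }}"
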
********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 11 December 2024 15:17:56 -0500 (0:00:00.101) 0:02:34.515 **** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.114) 0:02:34.630 **** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.068) 0:02:34.698 **** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.082) 0:02:34.781 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.077) 0:02:34.858 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.058) 0:02:34.917 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.064) 0:02:34.981 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.061) 0:02:35.043 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.085) 0:02:35.128 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.063) 0:02:35.192 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable 
thin pool space from int to Size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.061) 0:02:35.253 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.056) 0:02:35.310 **** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.044) 0:02:35.355 **** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.036) 0:02:35.391 **** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Wednesday 11 December 2024 15:17:57 -0500 (0:00:00.037) 0:02:35.429 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.161) 0:02:35.591 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.057) 0:02:35.648 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.121) 0:02:35.769 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.085) 0:02:35.855 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.088) 0:02:35.944 **** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", 
"skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.066) 0:02:36.010 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.070) 0:02:36.081 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.066) 0:02:36.148 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.052) 0:02:36.200 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.059) 0:02:36.260 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.083) 0:02:36.344 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 11 December 2024 15:17:58 -0500 (0:00:00.111) 0:02:36.455 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 11 December 2024 15:17:59 -0500 (0:00:00.101) 0:02:36.557 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 11 December 2024 15:17:59 -0500 (0:00:00.120) 0:02:36.677 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 11 December 2024 15:17:59 -0500 (0:00:00.094) 0:02:36.771 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Wednesday 11 December 2024 15:17:59 -0500 (0:00:00.077) 0:02:36.849 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:178 Wednesday 11 December 2024 15:17:59 -0500 (0:00:00.068) 0:02:36.917 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 11 December 2024 15:17:59 -0500 (0:00:00.171) 0:02:37.089 **** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 11 December 2024 15:17:59 -0500 (0:00:00.079) 0:02:37.169 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:17:59 -0500 (0:00:00.082) 0:02:37.251 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:17:59 -0500 (0:00:00.116) 0:02:37.368 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:17:59 -0500 (0:00:00.068) 0:02:37.436 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 
's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:18:00 -0500 (0:00:00.191) 0:02:37.628 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:18:00 -0500 (0:00:00.099) 0:02:37.727 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:18:00 -0500 (0:00:00.106) 0:02:37.833 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:18:00 -0500 (0:00:00.109) 0:02:37.943 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:18:00 -0500 (0:00:00.097) 0:02:38.041 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:18:00 -0500 (0:00:00.188) 0:02:38.229 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
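
Note: the storage_pools value echoed below is the input for the expected-failure case announced above (an encrypted volume with encryption: true but no key while storage_safe_mode is in effect). In a playbook, a pool of that shape would be declared roughly as in this sketch; the commented-out encryption_password line marks where a real invocation would supply the key this test deliberately omits, and vault_luks_password is a hypothetical variable name:

    - hosts: managed-node3
      roles:
        - role: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: partition
                disks:
                  - sda
                volumes:
                  - name: test1
                    type: partition
                    size: "4g"
                    mount_point: /opt/test1
                    encryption: true
                    # omitted here, as in the failing test case:
                    # encryption_password: "{{ vault_luks_password }}"
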
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:18:02 -0500 (0:00:01.375) 0:02:39.605 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:18:02 -0500 (0:00:00.088) 0:02:39.693 **** ok: [managed-node3] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:18:02 -0500 (0:00:00.096) 0:02:39.790 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:18:06 -0500 (0:00:04.196) 0:02:43.986 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:18:06 -0500 (0:00:00.179) 0:02:44.166 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:18:06 -0500 (0:00:00.050) 0:02:44.217 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:18:06 -0500 (0:00:00.073) 0:02:44.290 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:18:06 -0500 (0:00:00.061) 0:02:44.352 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:18:07 -0500 (0:00:00.854) 0:02:45.206 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": 
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { 
"name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:18:08 -0500 (0:00:01.301) 0:02:46.508 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:18:09 -0500 (0:00:00.182) 0:02:46.691 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:18:09 -0500 (0:00:00.065) 0:02:46.756 **** fatal: [managed-node3]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Wednesday 11 December 2024 15:18:13 -0500 (0:00:04.306) 0:02:51.063 **** fatal: [managed-node3]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': False, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 
0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"}
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:18:13 -0500 (0:00:00.075) 0:02:51.139 ****
TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 11 December 2024 15:18:13 -0500 (0:00:00.049) 0:02:51.189 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 11 December 2024 15:18:13 -0500 (0:00:00.061) 0:02:51.250 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 11 December 2024 15:18:13 -0500 (0:00:00.074) 0:02:51.325 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
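
The failed task's module arguments above are the point of this section: the requested volume carries u'encryption': True but u'encryption_password': None and no u'encryption_key', so the role refuses to create an unkeyed LUKS volume, and the verify-role-failed.yml assertions confirm the run failed with the expected message rather than crashing. A minimal reconstruction of the kind of pool specification that exercises this path (an illustration assembled from the module args above, not the test's literal source):

    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: true
            # encryption_password/encryption_key deliberately left unset,
            # so the role must fail safely with
            # "encrypted volume 'test1' missing key/password"

The next test, beginning below, repeats the same request with a password supplied.
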
's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:18:14 -0500 (0:00:00.136) 0:02:51.872 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:18:14 -0500 (0:00:00.058) 0:02:51.930 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:18:14 -0500 (0:00:00.054) 0:02:51.984 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:18:14 -0500 (0:00:00.050) 0:02:52.035 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:18:14 -0500 (0:00:00.043) 0:02:52.079 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:18:14 -0500 (0:00:00.104) 0:02:52.184 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:18:15 -0500 (0:00:01.130) 0:02:53.314 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }
TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:18:15 -0500 (0:00:00.108) 0:02:53.422 **** ok: [managed-node3] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }
TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:18:15 -0500 (0:00:00.047) 0:02:53.470 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:18:20 -0500 (0:00:04.192) 0:02:57.663 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:18:20 -0500 (0:00:00.102) 0:02:57.766 ****
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:18:20 -0500 (0:00:00.051) 0:02:57.817 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:18:20 -0500 (0:00:00.070) 0:02:57.888 ****
TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:18:20 -0500 (0:00:00.047) 0:02:57.935 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx
TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:18:21 -0500 (0:00:00.752) 0:02:58.687 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source":
"systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": 
"systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:18:22 -0500 (0:00:01.121) 0:02:59.809 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:18:22 -0500 (0:00:00.116) 0:02:59.925 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:18:22 -0500 (0:00:00.068) 0:02:59.994 **** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", 
"fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-eac68494-576f-4376-a0f8-295dff8524d7", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-f579724d-5e02-41d4-bc26-06655db2ab31", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:18:33 -0500 (0:00:10.801) 0:03:10.796 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:18:33 -0500 (0:00:10.801) 0:03:10.796 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:18:33 -0500 (0:00:00.078) 0:03:10.874 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948265.9874454, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "7cccf6ad08f76785c792aa0f21579a4edf4b4151", "ctime": 1733948265.9834454, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1733948265.9834454, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:18:34 -0500 (0:00:00.661) 0:03:11.535 **** ok: [managed-node3] => { "backup": "", "changed": false }
TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:18:34 -0500 (0:00:00.407) 0:03:11.943 ****
TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:18:34 -0500 (0:00:00.068) 0:03:12.012 **** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-eac68494-576f-4376-a0f8-295dff8524d7", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-f579724d-5e02-41d4-bc26-06655db2ab31", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults",
"owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:18:34 -0500 (0:00:00.079) 0:03:12.091 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": 
"defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:18:34 -0500 (0:00:00.142) 0:03:12.233 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:18:34 -0500 (0:00:00.086) 0:03:12.320 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-eac68494-576f-4376-a0f8-295dff8524d7" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 December 2024 15:18:35 -0500 (0:00:00.445) 0:03:12.765 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:18:35 -0500 (0:00:00.606) 0:03:13.371 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 December 2024 15:18:36 -0500 (0:00:00.544) 0:03:13.916 **** skipping: [managed-node3] => (item={u'src': u'/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", 
"changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:18:36 -0500 (0:00:00.061) 0:03:13.978 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:18:37 -0500 (0:00:00.593) 0:03:14.572 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948270.6084502, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "6b8378f869d63cd72ba8eb4d98a801730fd968ea", "ctime": 1733948268.1864476, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263817, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1733948268.1864476, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "18446744071595675387", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:18:37 -0500 (0:00:00.648) 0:03:15.221 **** changed: [managed-node3] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-eac68494-576f-4376-a0f8-295dff8524d7', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-eac68494-576f-4376-a0f8-295dff8524d7", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node3] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-f579724d-5e02-41d4-bc26-06655db2ab31', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-f579724d-5e02-41d4-bc26-06655db2ab31", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Wednesday 11 December 2024 15:18:38 -0500 (0:00:00.780) 0:03:16.001 **** ok: [managed-node3] TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:214 Wednesday 11 December 2024 15:18:39 -0500 (0:00:00.922) 0:03:16.924 **** included: 
TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:214 Wednesday 11 December 2024 15:18:39 -0500 (0:00:00.922) 0:03:16.924 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 11 December 2024 15:18:39 -0500 (0:00:00.219) 0:03:17.143 **** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 11 December 2024 15:18:39 -0500 (0:00:00.061) 0:03:17.204 **** skipping: [managed-node3] => {}
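The pool dictionary printed here is the role's fully expanded form of a much smaller user declaration. A minimal sketch of a playbook invocation that would produce an encrypted 4g XFS volume like this one (the passphrase is a placeholder; everything else mirrors this run):

- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_password: EXAMPLE_PASSPHRASE  # placeholder; the real value is masked in logs via no_log

Every null or defaulted key in the printed dict (cipher, key size, LUKS version, the RAID settings, and so on) is filled in by the role when omitted from the declaration.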
TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 11 December 2024 15:18:39 -0500 (0:00:00.041) 0:03:17.246 **** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "size": "10G", "type": "crypt", "uuid": "2d064acb-8772-4602-9a07-8df231cf4603" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "f579724d-5e02-41d4-bc26-06655db2ab31" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 11 December 2024 15:18:40 -0500 (0:00:00.393) 0:03:17.639 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002961", "end": "2024-12-11 15:18:40.404171", "rc": 0, "start": "2024-12-11 15:18:40.401210" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 11 December 2024 15:18:40 -0500 (0:00:00.388) 0:03:18.028 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002895", "end": "2024-12-11 15:18:40.889435", "failed_when_result": false, "rc": 0, "start": "2024-12-11 15:18:40.886540" } STDOUT: luks-f579724d-5e02-41d4-bc26-06655db2ab31 /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:18:40 -0500 (0:00:00.467) 0:03:18.495 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 11 December 2024 15:18:41 -0500 (0:00:00.126) 0:03:18.622 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 11 December 2024 15:18:41 -0500 (0:00:00.085) 0:03:18.707 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 11 December 2024 15:18:41 -0500 (0:00:00.124) 0:03:18.832 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 11 December 2024 15:18:41 -0500 (0:00:00.097) 0:03:18.930 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 11 December 2024 15:18:41 -0500 (0:00:00.184) 0:03:19.115 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
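These two reads capture the raw files that the later assertions run against: the fstab line mounting the mapper device at /opt/test1, and the single crypttab entry for it. The verification pattern, reduced to a self-contained sketch (task names hypothetical; the real checks are spread across the included verify tasks):

- name: Read /etc/crypttab (sketch)
  command: cat /etc/crypttab
  register: crypttab_out
  changed_when: false

- name: Expect exactly one entry for the new LUKS device (sketch)
  assert:
    that:
      - "crypttab_out.stdout_lines | select('search', 'luks-f579724d-5e02-41d4-bc26-06655db2ab31') | list | length == 1"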
TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 11 December 2024 15:18:41 -0500 (0:00:00.077) 0:03:19.193 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 11 December 2024 15:18:41 -0500 (0:00:00.079) 0:03:19.272 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 11 December 2024 15:18:41 -0500 (0:00:00.062) 0:03:19.334 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 11 December 2024 15:18:41 -0500 (0:00:00.055) 0:03:19.390 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 11 December 2024 15:18:41 -0500 (0:00:00.070) 0:03:19.460 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 11 December 2024 15:18:42 -0500 (0:00:00.083) 0:03:19.544 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 11 December 2024 15:18:42 -0500 (0:00:00.080) 0:03:19.624 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Wednesday 11 December 2024 15:18:42 -0500 (0:00:00.103) 0:03:19.727 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Wednesday 11 December 2024 15:18:42 -0500 (0:00:00.071) 0:03:19.799 **** ok: [managed-node3] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.43.66 closed.
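The grow-to-fill check above runs a probe on the managed node and gets back False, so the follow-up verification is skipped. The general shape of that probe-then-gate pattern, as a sketch (the blivet attribute name here is an assumption for illustration, not the role's actual probe):

- name: Probe whether blivet supports grow-to-fill (sketch)
  command: python -c 'import blivet.formats.lvmpv as m; print(hasattr(m.LVMPhysicalVolume, "grow_to_fill"))'  # attribute name assumed
  register: grow_supported
  changed_when: false

- name: Verify that PVs fill the whole devices when they should (sketch)
  debug:
    msg: grow-to-fill verification would run here
  when: grow_supported.stdout | trim | bool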
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Wednesday 11 December 2024 15:18:42 -0500 (0:00:00.370) 0:03:20.170 **** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Wednesday 11 December 2024 15:18:42 -0500 (0:00:00.061) 0:03:20.231 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 11 December 2024 15:18:42 -0500 (0:00:00.126) 0:03:20.358 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 11 December 2024 15:18:42 -0500 (0:00:00.058) 0:03:20.416 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 11 December 2024 15:18:42 -0500 (0:00:00.059) 0:03:20.475 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.052) 0:03:20.528 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.043) 0:03:20.572 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.046) 0:03:20.618 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.048) 0:03:20.667 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.045) 0:03:20.712 **** skipping: [managed-node3] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.038) 0:03:20.751 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.036) 0:03:20.788 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.043) 0:03:20.831 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.054) 0:03:20.886 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.125) 0:03:21.011 **** skipping: [managed-node3] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.082) 0:03:21.094 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.128) 0:03:21.222 **** skipping: [managed-node3] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": 
true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.082) 0:03:21.305 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.131) 0:03:21.436 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 11 December 2024 15:18:43 -0500 (0:00:00.061) 0:03:21.498 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.057) 0:03:21.555 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.059) 0:03:21.615 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.058) 0:03:21.674 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.198) 0:03:21.872 **** 
TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.198) 0:03:21.872 **** skipping: [managed-node3] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.078) 0:03:21.950 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.147) 0:03:22.098 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.056) 0:03:22.154 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools were created] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.054) 0:03:22.209 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.056) 0:03:22.266 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.062) 0:03:22.329 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.061) 0:03:22.390 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.061) 0:03:22.452 **** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 11 December 2024 15:18:44 -0500 (0:00:00.056) 0:03:22.508 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.111) 0:03:22.619 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
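The _storage_volume_tests list set just above drives the next task: one include per test subset, with the file name templated from the loop variable. That is also why the task header below still reads "Run test verify for {{ storage_test_volume_subset }}", the name is rendered before any value is bound. A sketch consistent with the eight includes that follow:

- name: Run test verify for {{ storage_test_volume_subset }}
  include_tasks: test-verify-volume-{{ storage_test_volume_subset }}.yml
  loop: "{{ _storage_volume_tests }}"
  loop_control:
    loop_var: storage_test_volume_subset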
TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.063) 0:03:22.683 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.269) 0:03:22.953 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.054) 0:03:23.007 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.056) 0:03:23.064 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.042) 0:03:23.107 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.048) 0:03:23.155 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.040) 0:03:23.196 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
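The "Verify the current mount state by device" assertion above checks the gathered mount facts for the expected device/mount-point pair. A condensed sketch of that kind of check (task name illustrative; the facts come from the standard setup module):

- name: Verify the device is mounted where expected (sketch)
  assert:
    that:
      - "ansible_facts['mounts'] | selectattr('device', 'equalto', '/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31') | selectattr('mount', 'equalto', '/opt/test1') | list | length == 1"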
TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.037) 0:03:23.234 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.039) 0:03:23.273 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.038) 0:03:23.312 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.037) 0:03:23.349 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.036) 0:03:23.386 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.053) 0:03:23.439 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 11 December 2024 15:18:45 -0500 (0:00:00.065) 0:03:23.505 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.045) 0:03:23.550 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.044) 0:03:23.595 **** skipping: [managed-node3] => {
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.046) 0:03:23.642 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.050) 0:03:23.692 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.040) 0:03:23.732 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.049) 0:03:23.782 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.056) 0:03:23.839 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948312.9874947, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948312.9874947, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 412668, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1733948312.9874947, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.350) 0:03:24.189 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.046) 0:03:24.236 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.035) 0:03:24.271 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.045) 0:03:24.317 **** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.040) 0:03:24.358 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.039) 0:03:24.398 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 11 December 2024 15:18:46 -0500 (0:00:00.050) 0:03:24.448 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948313.126495, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948313.126495, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 413737, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1733948313.126495, "nlink": 1, "path": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 11 December 2024 15:18:47 -0500 (0:00:00.341) 0:03:24.790 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 11 December 2024 15:18:47 -0500 (0:00:00.628) 0:03:25.418 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.027032", "end": "2024-12-11 15:18:48.204614", "rc": 0, "start": "2024-12-11 15:18:48.177582" } STDOUT: LUKS header information for /dev/sda1 
Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: c8 1e 69 a8 dc 34 f5 91 43 a4 cd 4c e3 43 bd 1e a2 3e f0 f6 MK salt: 95 6b 6c 66 31 19 56 a8 88 cd 65 f0 bb ec e1 e6 a0 ed 0d 73 d6 c3 b9 53 c4 24 1e 26 db 49 a4 48 MK iterations: 22787 UUID: f579724d-5e02-41d4-bc26-06655db2ab31 Key Slot 0: ENABLED Iterations: 366634 Salt: cb 9b f5 97 df ae a3 a3 2d 0b 21 b2 5f fd 46 4e 67 73 16 1b ef 6d 5f 0f 0a 00 55 fa ec 2c 26 be Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.369) 0:03:25.788 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.049) 0:03:25.837 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.048) 0:03:25.886 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.047) 0:03:25.934 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.046) 0:03:25.980 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.040) 0:03:26.021 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.040) 0:03:26.062 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
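The header dump above shows a LUKS1 container using aes in xts-plain64 mode with a single enabled key slot; the skipped version/key-size/cipher tasks only assert against this output when the test requests specific values. A sketch of such a check (expected values taken from this run's header; task names illustrative):

- name: Collect LUKS header info (sketch)
  command: cryptsetup luksDump /dev/sda1
  register: luks_dump
  changed_when: false

- name: Check LUKS version and cipher (sketch)
  assert:
    that:
      - luks_dump.stdout is search('Version:\s+1')
      - luks_dump.stdout is search('Cipher name:\s+aes')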
{ "_storage_test_crypttab_entries": [ "luks-f579724d-5e02-41d4-bc26-06655db2ab31 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.046) 0:03:26.149 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.043) 0:03:26.192 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.047) 0:03:26.239 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.046) 0:03:26.286 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.049) 0:03:26.335 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.037) 0:03:26.373 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.037) 0:03:26.410 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.037) 0:03:26.448 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 11 December 2024 15:18:48 -0500 (0:00:00.036) 0:03:26.484 **** skipping: [managed-node3] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.052) 0:03:26.537 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.039) 0:03:26.576 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.039) 0:03:26.616 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.037) 0:03:26.653 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.037) 0:03:26.691 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.037) 0:03:26.729 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.035) 0:03:26.764 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.043) 0:03:26.807 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.039) 0:03:26.847 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.039) 0:03:26.887 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.041) 0:03:26.929 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.039) 0:03:26.968 **** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.038) 0:03:27.006 **** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.042) 0:03:27.049 **** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.039) 0:03:27.089 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.040) 0:03:27.129 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.036) 0:03:27.166 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.037) 0:03:27.204 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.038) 0:03:27.242 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.039) 0:03:27.281 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.038) 0:03:27.319 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.037) 0:03:27.357 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.036) 0:03:27.393 **** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.036) 0:03:27.430 **** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.035) 0:03:27.466 **** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Wednesday 11 December 2024 15:18:49 -0500 (0:00:00.040) 0:03:27.506 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.042) 0:03:27.549 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.043) 0:03:27.593 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.038) 0:03:27.632 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] 
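The size tasks above (all skipped here because the volume is not thin-provisioned) describe a clamp: compute the maximum usable thin pool space, then apply an upper and a lower limit to it. A sketch with descriptive placeholder names, since the role's actual variable names are not visible in this run:

- name: Apply upper size limit to max usable thin pool space (sketch)
  set_fact:
    _max_usable: "{{ [_max_usable | int, _upper_limit | int] | min }}"

- name: Apply lower size limit to max usable thin pool space (sketch)
  set_fact:
    _max_usable: "{{ [_max_usable | int, _lower_limit | int] | max }}"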
********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.035) 0:03:27.667 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.037) 0:03:27.705 **** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.044) 0:03:27.750 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.041) 0:03:27.791 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.041) 0:03:27.833 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.043) 0:03:27.877 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.049) 0:03:27.926 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.054) 0:03:27.980 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.059) 0:03:28.040 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.044) 0:03:28.085 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.051) 0:03:28.136 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.044) 0:03:28.181 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.042) 0:03:28.223 **** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.032) 0:03:28.256 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 11 December 2024 15:18:50 -0500 (0:00:00.040) 0:03:28.296 **** changed: [managed-node3] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:220 Wednesday 11 December 2024 15:18:51 -0500 (0:00:00.360) 0:03:28.656 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 11 December 2024 15:18:51 -0500 (0:00:00.129) 0:03:28.786 **** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 11 December 2024 15:18:51 -0500 (0:00:00.063) 0:03:28.850 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 
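The "Create a file" task above drops an empty marker file, /opt/test1/quux, onto the mounted volume so the later verify-data-preservation steps can prove the filesystem survived the role run. Judging by the returned state of "file" and size 0, an equivalent sketch (the module choice is an assumption) would be:

- name: Create a file
  file:
    path: /opt/test1/quux
    state: touch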
15:18:51 -0500 (0:00:00.161) 0:03:29.011 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:18:51 -0500 (0:00:00.083) 0:03:29.094 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:18:51 -0500 (0:00:00.057) 0:03:29.151 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:18:51 -0500 (0:00:00.104) 0:03:29.256 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:18:51 -0500 (0:00:00.040) 0:03:29.296 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:18:51 -0500 (0:00:00.036) 0:03:29.332 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:18:51 -0500 (0:00:00.050) 0:03:29.383 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
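Note the last entry of blivet_package_list in the CentOS_7.yml vars above: a list element can itself be a Jinja expression that is resolved per host, so s390x machines pull libblockdev-s390 while everything else gets libblockdev. The pattern, copied from the output:

blivet_package_list:
  - python-enum34
  - python-blivet3
  - libblockdev-crypto
  - libblockdev-dm
  - libblockdev-lvm
  - libblockdev-mdraid
  - libblockdev-swap
  - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"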
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:18:51 -0500 (0:00:00.050) 0:03:29.434 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:18:52 -0500 (0:00:00.113) 0:03:29.548 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:18:53 -0500 (0:00:01.206) 0:03:30.754 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:18:53 -0500 (0:00:00.068) 0:03:30.823 **** ok: [managed-node3] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:18:53 -0500 (0:00:00.052) 0:03:30.875 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:18:57 -0500 (0:00:03.845) 0:03:34.721 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:18:57 -0500 (0:00:00.128) 0:03:34.850 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
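The storage_pools value printed above is the requested state for this safe_mode test: one partition pool on sda with a 4g volume mounted at /opt/test1 and encryption: false, while the device currently holds LUKS. A sketch of a play requesting the same state, with values copied from the output and storage_safe_mode matching the storage_safe_mode_global saved earlier:

- hosts: managed-node3
  vars:
    storage_safe_mode: true
    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: false
            encryption_password: yabbadabbadoo
  roles:
    - fedora.linux_system_roles.storage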
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:18:57 -0500 (0:00:00.054) 0:03:34.904 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:18:57 -0500 (0:00:00.061) 0:03:34.966 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:18:57 -0500 (0:00:00.054) 0:03:35.020 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:18:58 -0500 (0:00:00.756) 0:03:35.776 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": 
"getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", 
"status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": 
"rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service": { "name": "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:18:59 -0500 (0:00:01.151) 0:03:36.928 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:18:59 -0500 (0:00:00.153) 0:03:37.082 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2deac68494\x2d576f\x2d4376\x2da0f8\x2d295dff8524d7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service", "name": "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-replay.service systemd-journald.socket system-systemd\\x2dcryptsetup.slice dev-sda.device systemd-readahead-collect.service cryptsetup-pre.target", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-eac68494-576f-4376-a0f8-295dff8524d7", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-eac68494-576f-4376-a0f8-295dff8524d7 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-eac68494-576f-4376-a0f8-295dff8524d7 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": 
"/run/systemd/generator/systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:19:00 -0500 (0:00:00.643) 0:03:37.726 **** fatal: [managed-node3]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-f579724d-5e02-41d4-bc26-06655db2ab31' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Wednesday 11 December 2024 15:19:04 -0500 (0:00:04.107) 0:03:41.833 **** fatal: [managed-node3]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, 
u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-f579724d-5e02-41d4-bc26-06655db2ab31' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:19:04 -0500 (0:00:00.056) 0:03:41.890 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2deac68494\x2d576f\x2d4376\x2da0f8\x2d295dff8524d7.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service", "name": "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2deac68494\\x2d576f\\x2d4376\\x2da0f8\\x2d295dff8524d7.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", 
"PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 11 December 2024 15:19:04 -0500 (0:00:00.536) 0:03:42.427 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 11 December 2024 15:19:04 -0500 (0:00:00.057) 0:03:42.485 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 11 December 2024 15:19:05 -0500 (0:00:00.061) 0:03:42.547 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 11 December 2024 15:19:05 -0500 (0:00:00.040) 0:03:42.587 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948331.0735137, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1733948331.0735137, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1733948331.0735137, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "1589485278", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 11 December 2024 15:19:05 -0500 (0:00:00.334) 0:03:42.922 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:244 Wednesday 11 December 2024 15:19:05 -0500 (0:00:00.041) 0:03:42.963 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:19:05 -0500 (0:00:00.214) 0:03:43.177 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:19:05 -0500 (0:00:00.059) 0:03:43.237 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:19:05 -0500 (0:00:00.056) 0:03:43.293 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:19:05 -0500 (0:00:00.127) 0:03:43.421 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:19:05 -0500 (0:00:00.056) 0:03:43.478 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:19:06 -0500 (0:00:00.060) 0:03:43.538 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:19:06 -0500 (0:00:00.051) 0:03:43.589 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:19:06 -0500 (0:00:00.046) 0:03:43.635 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:19:06 -0500 (0:00:00.106) 0:03:43.742 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:19:07 -0500 (0:00:01.226) 0:03:44.968 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:19:07 -0500 (0:00:00.046) 0:03:45.014 **** ok: [managed-node3] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:19:07 -0500 (0:00:00.041) 0:03:45.056 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK 
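
Show storage_pools above is the specification for this second role run: the same foo pool and test1 volume as before, but now with encryption: false, while the old encryption_password is still supplied so the role can address the existing LUKS device. The "VARIABLE IS NOT DEFINED!" shown for storage_volumes is benign; the role substitutes an empty list, as the _storage_volumes_list: [] fact above shows. Reconstructed from the logged values, the variables for this run are approximately:

    storage_pools:
      - name: foo
        type: partition
        disks:
          - sda
        volumes:
          - name: test1
            type: partition
            size: 4g
            mount_point: /opt/test1
            encryption: false
            encryption_password: yabbadabbadoo
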
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:19:11 -0500 (0:00:04.027) 0:03:49.083 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:19:11 -0500 (0:00:00.066) 0:03:49.150 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:19:11 -0500 (0:00:00.033) 0:03:49.184 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:19:11 -0500 (0:00:00.037) 0:03:49.222 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:19:11 -0500 (0:00:00.036) 0:03:49.258 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:19:12 -0500 (0:00:00.643) 0:03:49.901 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service": { "name": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:19:13 -0500 (0:00:00.973) 0:03:50.875 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:19:13 -0500 (0:00:00.056) 0:03:50.931 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2df579724d\x2d5e02\x2d41d4\x2dbc26\x2d06655db2ab31.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "name": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-journald.socket cryptsetup-pre.target systemd-readahead-replay.service dev-sda1.device systemd-readahead-collect.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-f579724d-5e02-41d4-bc26-06655db2ab31", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) 
man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f579724d-5e02-41d4-bc26-06655db2ab31 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f579724d-5e02-41d4-bc26-06655db2ab31 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": 
"50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:19:13 -0500 (0:00:00.508) 0:03:51.440 **** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-f579724d-5e02-41d4-bc26-06655db2ab31", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:19:18 -0500 (0:00:04.410) 
0:03:55.851 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:19:18 -0500 (0:00:00.039) 0:03:55.891 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948316.2844982, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "cf3ec65a65d88b411a740dd3de0826b8c647fd03", "ctime": 1733948316.2814982, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1733948316.2814982, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:19:18 -0500 (0:00:00.390) 0:03:56.281 **** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:19:19 -0500 (0:00:00.380) 0:03:56.662 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2df579724d\x2d5e02\x2d41d4\x2dbc26\x2d06655db2ab31.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "name": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": 
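
The "Check if /etc/fstab is present" and "Add fingerprint to /etc/fstab if present" tasks above guard the upcoming mount edits: the role only touches fstab when the file exists, and it stamps the file with a fingerprint comment so later runs can recognize role-managed edits; changed: false with an empty backup means the stamp was already in place from earlier runs. Assuming lineinfile and the system_role fingerprint convention (both assumptions, not taken from this log), the step looks roughly like:

    - name: Add fingerprint to /etc/fstab if present
      lineinfile:
        path: /etc/fstab
        line: "# system_role:storage"   # assumed fingerprint text
        insertbefore: BOF
      when: __storage_fstab_stat.stat.exists   # hypothetical register name
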
"no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:19:19 -0500 (0:00:00.558) 0:03:57.220 **** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-f579724d-5e02-41d4-bc26-06655db2ab31", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", 
"path": "/opt/test1", "src": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:19:19 -0500 (0:00:00.052) 0:03:57.273 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": 
"uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:19:19 -0500 (0:00:00.050) 0:03:57.324 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:19:19 -0500 (0:00:00.044) 0:03:57.368 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f579724d-5e02-41d4-bc26-06655db2ab31" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 December 2024 15:19:20 -0500 (0:00:00.380) 0:03:57.749 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:19:20 -0500 (0:00:00.542) 0:03:58.291 **** changed: [managed-node3] => (item={u'src': u'UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 December 2024 15:19:21 -0500 (0:00:00.543) 0:03:58.835 **** skipping: [managed-node3] => (item={u'src': u'UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { 
"ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:19:21 -0500 (0:00:00.085) 0:03:58.920 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:19:21 -0500 (0:00:00.512) 0:03:59.432 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948320.888503, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1fcb1a6ddcf1839d37edc37f13355d27b59c8b73", "ctime": 1733948318.3915005, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263817, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1733948318.3915005, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "18446744071595675541", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:19:22 -0500 (0:00:00.504) 0:03:59.937 **** changed: [managed-node3] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-f579724d-5e02-41d4-bc26-06655db2ab31', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-f579724d-5e02-41d4-bc26-06655db2ab31", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Wednesday 11 December 2024 15:19:22 -0500 (0:00:00.434) 0:04:00.371 **** ok: [managed-node3] TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:261 Wednesday 11 December 2024 15:19:23 -0500 (0:00:00.759) 0:04:01.130 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 11 December 2024 15:19:23 -0500 (0:00:00.152) 0:04:01.283 **** ok: [managed-node3] => { "_storage_pools_list": [ { 
"disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 11 December 2024 15:19:23 -0500 (0:00:00.078) 0:04:01.361 **** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 11 December 2024 15:19:23 -0500 (0:00:00.084) 0:04:01.446 **** ok: [managed-node3] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "f3a909e8-9d96-4537-b004-5ac4c37c141e" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 11 December 2024 15:19:24 -0500 (0:00:00.694) 0:04:02.140 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003217", "end": "2024-12-11 15:19:24.930942", "rc": 0, "start": "2024-12-11 15:19:24.927725" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 11 December 2024 15:19:25 -0500 (0:00:00.401) 0:04:02.542 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002863", "end": "2024-12-11 15:19:25.350742", "failed_when_result": false, "rc": 0, "start": "2024-12-11 15:19:25.347879" }
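The empty STDOUT from cat /etc/crypttab confirms that the entry removed a few tasks earlier ("1 line(s) removed") is really gone, and the new fstab line mounts by filesystem UUID rather than the old /dev/mapper path. A possible manual spot-check of the same condition, shown purely for illustration (this is not one of the test's tasks, and the register name is hypothetical):

- name: Confirm no LUKS entries remain in /etc/crypttab (illustrative spot-check)
  command: grep luks- /etc/crypttab
  register: __crypttab_luks              # hypothetical register name
  changed_when: false
  failed_when: __crypttab_luks.rc == 0   # any match would mean a stale crypttab entry survived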
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:19:25 -0500 (0:00:00.415) 0:04:02.957 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 11 December 2024 15:19:25 -0500 (0:00:00.114) 0:04:03.072 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 11 December 2024 15:19:25 -0500 (0:00:00.041) 0:04:03.113 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 11 December 2024 15:19:25 -0500 (0:00:00.039) 0:04:03.152 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 11 December 2024 15:19:25 -0500 (0:00:00.047) 0:04:03.200 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 11 December 2024 15:19:25 -0500 (0:00:00.133) 0:04:03.334 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 11 December 2024 15:19:25 -0500 (0:00:00.078) 0:04:03.412 **** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 11 December 2024 15:19:25 -0500 (0:00:00.078) 0:04:03.490
**** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 11 December 2024 15:19:26 -0500 (0:00:00.101) 0:04:03.592 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 11 December 2024 15:19:26 -0500 (0:00:00.101) 0:04:03.693 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 11 December 2024 15:19:26 -0500 (0:00:00.087) 0:04:03.781 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 11 December 2024 15:19:26 -0500 (0:00:00.096) 0:04:03.877 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 11 December 2024 15:19:26 -0500 (0:00:00.094) 0:04:03.971 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Wednesday 11 December 2024 15:19:26 -0500 (0:00:00.116) 0:04:04.088 **** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Wednesday 11 December 2024 15:19:26 -0500 (0:00:00.079) 0:04:04.168 **** ok: [managed-node3] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.43.66 closed.
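The grow-to-fill probe above printed False, and with a partition-type pool there are no LVM PVs anyway, so the PV-fill verification that follows produces no items. The verification tasks in this phase mostly reduce to asserts over facts gathered earlier; a representative sketch of that pattern (the fact and task names here are illustrative, not the test's actual ones):

- name: Assert the recreated filesystem matches the requested type
  assert:
    that:
      - __blkinfo.info['/dev/sda1'].fstype == 'xfs'           # __blkinfo stands in for the registered blockdev info fact
      - __blkinfo.info['/dev/sda1'].mountpoint == '/opt/test1'
    msg: unexpected filesystem state on /dev/sda1

The values asserted in this sketch correspond to the lsblk-style dump produced by the "Collect info about the volumes." task later in the log.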
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.395) 0:04:04.563 **** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.052) 0:04:04.616 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.123) 0:04:04.739 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.060) 0:04:04.800 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.061) 0:04:04.862 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.053) 0:04:04.915 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.055) 0:04:04.971 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.056) 0:04:05.028 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.081) 0:04:05.109 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.056) 0:04:05.166 **** skipping: [managed-node3] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.056) 0:04:05.222 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.054) 0:04:05.277 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.055) 0:04:05.333 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Wednesday 11 December 2024 15:19:27 -0500 (0:00:00.054) 0:04:05.387 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 11 December 2024 15:19:28 -0500 (0:00:00.188) 0:04:05.575 **** skipping: [managed-node3] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Wednesday 11 December 2024 15:19:28 -0500 (0:00:00.077) 0:04:05.653 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 11 December 2024 15:19:28 -0500 (0:00:00.118) 0:04:05.771 **** skipping: [managed-node3] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Wednesday 11 December 2024 15:19:28 -0500 (0:00:00.079) 0:04:05.850 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 11 December 2024 15:19:28 -0500 (0:00:00.134) 0:04:05.985 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 11 December 2024 15:19:28 -0500 (0:00:00.062) 0:04:06.048 **** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 11 December 2024 15:19:28 -0500 (0:00:00.048) 0:04:06.096 **** TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 11 December 2024 15:19:28 -0500 (0:00:00.065) 0:04:06.162 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Wednesday 11 December 2024 15:19:28 -0500 (0:00:00.063) 0:04:06.225 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 11 December 2024 15:19:28 -0500 (0:00:00.132) 0:04:06.358 **** skipping: [managed-node3] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, 
u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Wednesday 11 December 2024 15:19:28 -0500 (0:00:00.073) 0:04:06.431 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 11 December 2024 15:19:29 -0500 (0:00:00.147) 0:04:06.579 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Wednesday 11 December 2024 15:19:29 -0500 (0:00:00.056) 0:04:06.635 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 11 December 2024 15:19:29 -0500 (0:00:00.054) 0:04:06.690 **** skipping: [managed-node3] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Wednesday 11 December 2024 15:19:29 -0500 (0:00:00.055) 0:04:06.746 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Wednesday 11 December 2024 15:19:29 -0500 (0:00:00.054) 0:04:06.801 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Wednesday 11 December 2024 15:19:29 -0500 (0:00:00.059) 0:04:06.860 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Wednesday 11 December 2024 15:19:29 -0500 (0:00:00.057) 0:04:06.918 **** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 11 December 2024 15:19:29 -0500 (0:00:00.057) 0:04:06.975 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:19:29 -0500 (0:00:00.113) 0:04:07.089 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:19:29 -0500 (0:00:00.065) 0:04:07.154 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 11 December 2024 15:19:30 -0500 (0:00:00.409) 0:04:07.563 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 11 December 2024 15:19:30 -0500 (0:00:00.085) 0:04:07.649 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 11 December 2024 15:19:30 -0500 (0:00:00.069) 0:04:07.718 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Wednesday 11 December 2024 15:19:30 -0500 (0:00:00.070) 0:04:07.789 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Wednesday 11 December 2024 15:19:30 -0500 (0:00:00.097) 0:04:07.886 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Wednesday 11 December 2024 15:19:30 -0500 (0:00:00.099) 0:04:07.985 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Wednesday 11 December 2024 15:19:30 -0500 (0:00:00.117) 0:04:08.102 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Wednesday 11 December 2024 15:19:30 -0500 (0:00:00.112) 0:04:08.215 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Wednesday 11 December 2024 15:19:30 -0500 (0:00:00.121) 0:04:08.337 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Wednesday 11 December 2024 15:19:30 -0500 (0:00:00.115) 0:04:08.452 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Wednesday 11 December 2024 15:19:30 -0500 (0:00:00.060) 0:04:08.513 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 11 December 2024 15:19:31 -0500 (0:00:00.118) 0:04:08.631 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 11 December 2024 15:19:31 -0500 (0:00:00.099) 0:04:08.731 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 11 December 2024 15:19:31 -0500 (0:00:00.069) 0:04:08.800 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 11 December 2024 15:19:31 -0500 (0:00:00.058) 0:04:08.859 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 11 December 2024 15:19:31 -0500 (0:00:00.054) 0:04:08.914 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Wednesday 
11 December 2024 15:19:31 -0500 (0:00:00.045) 0:04:08.960 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 11 December 2024 15:19:31 -0500 (0:00:00.040) 0:04:09.000 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 11 December 2024 15:19:31 -0500 (0:00:00.051) 0:04:09.051 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 11 December 2024 15:19:31 -0500 (0:00:00.055) 0:04:09.107 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948358.2125423, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948358.2125423, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 424250, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1733948358.2125423, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 11 December 2024 15:19:31 -0500 (0:00:00.341) 0:04:09.448 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 11 December 2024 15:19:31 -0500 (0:00:00.067) 0:04:09.516 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 11 December 2024 15:19:32 -0500 (0:00:00.054) 0:04:09.570 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 11 December 2024 15:19:32 -0500 (0:00:00.068) 0:04:09.639 **** ok: 
[managed-node3] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 11 December 2024 15:19:32 -0500 (0:00:00.060) 0:04:09.699 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 11 December 2024 15:19:32 -0500 (0:00:00.054) 0:04:09.753 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 11 December 2024 15:19:32 -0500 (0:00:00.068) 0:04:09.822 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 11 December 2024 15:19:32 -0500 (0:00:00.133) 0:04:09.955 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.737) 0:04:10.692 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.055) 0:04:10.748 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.064) 0:04:10.812 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.070) 0:04:10.883 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.058) 0:04:10.941 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] 
TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.073) 0:04:11.014 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.054) 0:04:11.069 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.055) 0:04:11.125 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.055) 0:04:11.181 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.069) 0:04:11.250 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.065) 0:04:11.316 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.059) 0:04:11.375 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.076) 0:04:11.452 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
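The crypttab checks above hinge on the facts set in "Set test variables": with this volume not yet encrypted, the test expects zero /etc/crypttab entries for it. A plausible shape for that assertion, assuming the fact names shown in the log (the real task source may differ):

```yaml
# Illustrative only: asserts the crypttab entry count matches the expectation
# recorded in the facts above (_storage_test_expected_crypttab_entries: "0").
- name: Check for /etc/crypttab entry
  assert:
    that:
      - _storage_test_crypttab_entries | length ==
        _storage_test_expected_crypttab_entries | int
    msg: "Unexpected number of /etc/crypttab entries for this volume"
```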
TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Wednesday 11 December 2024 15:19:33 -0500 (0:00:00.065) 0:04:11.517 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.056) 0:04:11.574 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.057) 0:04:11.632 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.053) 0:04:11.685 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.059) 0:04:11.745 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.054) 0:04:11.800 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.056) 0:04:11.856 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.061) 0:04:11.918 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.057) 0:04:11.975 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.060) 0:04:12.036 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID chunk size] *************************************************** task path:
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.060) 0:04:12.096 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.057) 0:04:12.154 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.063) 0:04:12.218 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.059) 0:04:12.277 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.059) 0:04:12.336 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.060) 0:04:12.396 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 11 December 2024 15:19:34 -0500 (0:00:00.074) 0:04:12.471 **** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.071) 0:04:12.542 **** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.058) 0:04:12.601 **** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.061) 0:04:12.662 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] 
********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.066) 0:04:12.728 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.138) 0:04:12.867 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.079) 0:04:12.947 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.064) 0:04:13.012 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.061) 0:04:13.073 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.053) 0:04:13.127 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.066) 0:04:13.194 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.054) 0:04:13.248 **** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.074) 0:04:13.323 **** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.054) 0:04:13.377 **** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool 
size] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.057) 0:04:13.435 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Wednesday 11 December 2024 15:19:35 -0500 (0:00:00.067) 0:04:13.502 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.105) 0:04:13.608 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.087) 0:04:13.695 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.073) 0:04:13.769 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.061) 0:04:13.831 **** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.070) 0:04:13.902 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.068) 0:04:13.971 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.061) 0:04:14.032 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.058) 0:04:14.090 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.074) 0:04:14.164 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.093) 0:04:14.258 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.083) 0:04:14.341 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.066) 0:04:14.408 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 11 December 2024 15:19:36 -0500 (0:00:00.056) 0:04:14.465 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 11 December 2024 15:19:37 -0500 (0:00:00.062) 0:04:14.527 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:19:37 -0500 (0:00:00.063) 0:04:14.590 ****
TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Wednesday 11 December 2024 15:19:37 -0500 (0:00:00.128) 0:04:14.719 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }
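With the verification pass complete, the test now writes a canary file onto the mounted volume so a later verify-data-preservation step can confirm nothing was reformatted. Judging by the result below, create-test-file.yml amounts to something like this sketch (an assumption based on the output, not a quote of the test source):

```yaml
# Hypothetical reconstruction of create-test-file.yml, inferred from the
# "Create a file" result below: touch an empty file on the mounted test
# volume to act as a data-preservation canary.
- name: Create a file
  file:
    path: /opt/test1/quux
    state: touch
```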
"dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:267 Wednesday 11 December 2024 15:19:37 -0500 (0:00:00.376) 0:04:15.144 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 11 December 2024 15:19:37 -0500 (0:00:00.127) 0:04:15.271 **** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 11 December 2024 15:19:37 -0500 (0:00:00.057) 0:04:15.329 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:19:37 -0500 (0:00:00.058) 0:04:15.387 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:19:37 -0500 (0:00:00.061) 0:04:15.449 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:19:37 -0500 (0:00:00.061) 0:04:15.511 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:19:38 -0500 (0:00:00.127) 0:04:15.638 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:19:38 -0500 (0:00:00.051) 0:04:15.690 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:19:38 -0500 (0:00:00.054) 0:04:15.745 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:19:38 -0500 (0:00:00.047) 0:04:15.793 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:19:38 -0500 (0:00:00.044) 0:04:15.837 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:19:38 -0500 (0:00:00.088) 0:04:15.926 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34
TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:19:39 -0500 (0:00:01.121) 0:04:17.048 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }
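The storage_pools value printed above is the input driving this negative test: it asks the role to layer LUKS encryption onto test1, whose device (sda1) already carries an xfs filesystem. As playbook variables it would read roughly as follows (a YAML rendering of the JSON shown, not a quote from tests_luks.yml):

```yaml
# YAML form of the storage_pools input shown above. Note encryption: true on a
# volume whose device is already formatted; with safe mode on (the default),
# the role must refuse to destroy that formatting.
storage_pools:
  - name: foo
    type: partition
    disks:
      - sda
    volumes:
      - name: test1
        type: partition
        size: 4g
        mount_point: /opt/test1
        encryption: true
        encryption_password: yabbadabbadoo
```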
TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:19:39 -0500 (0:00:00.057) 0:04:17.105 **** ok: [managed-node3] => { "storage_volumes": [] }
TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:19:39 -0500 (0:00:00.066) 0:04:17.172 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:19:43 -0500 (0:00:03.835) 0:04:21.007 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3
TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:19:43 -0500 (0:00:00.106) 0:04:21.113 ****
TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:19:43 -0500 (0:00:00.045) 0:04:21.158 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:19:43 -0500 (0:00:00.054) 0:04:21.212 ****
TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:19:43 -0500 (0:00:00.049) 0:04:21.262 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx
TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:19:44 -0500 (0:00:00.751) 0:04:22.013 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" },
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": 
"dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": 
"rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service": { "name": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": 
"systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:19:45 -0500 (0:00:01.085) 0:04:23.099 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:19:45 -0500 (0:00:00.082) 0:04:23.181 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2df579724d\x2d5e02\x2d41d4\x2dbc26\x2d06655db2ab31.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "name": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service systemd-journald.socket dev-sda1.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", 
"AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-f579724d-5e02-41d4-bc26-06655db2ab31", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f579724d-5e02-41d4-bc26-06655db2ab31 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-f579724d-5e02-41d4-bc26-06655db2ab31 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": 
"yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:19:46 -0500 (0:00:00.596) 0:04:23.778 **** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Wednesday 11 December 2024 15:19:50 -0500 (0:00:04.057) 0:04:27.836 **** fatal: [managed-node3]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:19:50 -0500 (0:00:00.112) 0:04:27.949 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2df579724d\x2d5e02\x2d41d4\x2dbc26\x2d06655db2ab31.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "name": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2df579724d\\x2d5e02\\x2d41d4\\x2dbc26\\x2d06655db2ab31.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", 
"StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 11 December 2024 15:19:50 -0500 (0:00:00.511) 0:04:28.460 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 11 December 2024 15:19:50 -0500 (0:00:00.044) 0:04:28.505 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 11 December 2024 15:19:51 -0500 (0:00:00.051) 0:04:28.557 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 11 December 2024 15:19:51 -0500 (0:00:00.044) 0:04:28.602 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948377.5365627, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1733948377.5365627, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1733948377.5365627, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "912652172", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 11 December 2024 15:19:51 -0500 (0:00:00.410) 0:04:29.012 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:293 Wednesday 11 December 2024 15:19:51 -0500 (0:00:00.053) 0:04:29.065 **** ok: [managed-node3] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", "owner": 
"root", "path": "/tmp/storage_testTa18zclukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:300 Wednesday 11 December 2024 15:19:51 -0500 (0:00:00.449) 0:04:29.515 **** ok: [managed-node3] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testTa18zclukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1733948392.05-18974-103187838687852/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:307 Wednesday 11 December 2024 15:19:52 -0500 (0:00:00.772) 0:04:30.288 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:19:52 -0500 (0:00:00.083) 0:04:30.371 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:19:52 -0500 (0:00:00.073) 0:04:30.445 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:19:52 -0500 (0:00:00.058) 0:04:30.503 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:19:53 -0500 (0:00:00.095) 0:04:30.599 **** skipping: [managed-node3] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:19:53 -0500 (0:00:00.039) 0:04:30.638 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:19:53 -0500 (0:00:00.043) 0:04:30.681 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:19:53 -0500 (0:00:00.056) 0:04:30.737 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:19:53 -0500 (0:00:00.051) 0:04:30.789 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:19:53 -0500 (0:00:00.114) 0:04:30.904 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:19:54 -0500 (0:00:01.112) 0:04:32.016 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testTa18zclukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:19:54 -0500 (0:00:00.063) 0:04:32.080 
TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Wednesday 11 December 2024 15:19:54 -0500 (0:00:00.063) 0:04:32.080 ****
ok: [managed-node3] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Wednesday 11 December 2024 15:19:54 -0500 (0:00:00.049) 0:04:32.130 ****
ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Wednesday 11 December 2024 15:19:58 -0500 (0:00:04.170) 0:04:36.300 ****
included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Wednesday 11 December 2024 15:19:58 -0500 (0:00:00.121) 0:04:36.422 ****

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Wednesday 11 December 2024 15:19:58 -0500 (0:00:00.051) 0:04:36.474 ****
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Wednesday 11 December 2024 15:19:59 -0500 (0:00:00.063) 0:04:36.537 ****

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Wednesday 11 December 2024 15:19:59 -0500 (0:00:00.052) 0:04:36.590 ****
ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
lsrpackages: cryptsetup kpartx

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Wednesday 11 December 2024 15:19:59 -0500 (0:00:00.748) 0:04:37.339 ****
ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": {
"name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": 
"systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:20:00 -0500 (0:00:01.074) 0:04:38.413 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:20:00 -0500 (0:00:00.105) 0:04:38.518 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:20:01 -0500 (0:00:00.048) 0:04:38.566 **** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": 
null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:20:11 -0500 (0:00:10.242) 0:04:48.809 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:20:11 -0500 (0:00:00.053) 0:04:48.862 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948361.1565454, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "4c01b4984bb30cf401542926f3236ff194eefc19", "ctime": 1733948361.1535454, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1733948361.1535454, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add 
fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:20:11 -0500 (0:00:00.383) 0:04:49.246 **** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:20:12 -0500 (0:00:00.361) 0:04:49.607 **** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:20:12 -0500 (0:00:00.051) 0:04:49.658 **** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:20:12 -0500 (0:00:00.076) 0:04:49.735 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:20:12 -0500 (0:00:00.069) 0:04:49.812 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:20:12 -0500 (0:00:00.077) 0:04:49.890 **** changed: [managed-node3] => (item={u'src': u'UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=f3a909e8-9d96-4537-b004-5ac4c37c141e" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view 
TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 December 2024 15:20:12 -0500 (0:00:00.448) 0:04:50.338 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:20:13 -0500 (0:00:00.551) 0:04:50.890 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 December 2024 15:20:13 -0500 (0:00:00.470) 0:04:51.360 **** skipping: [managed-node3] => (item={u'src': u'/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:20:13 -0500 (0:00:00.071) 0:04:51.432 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:20:14 -0500 (0:00:00.544) 0:04:51.977 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948365.3495498, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1733948362.7475471, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263814, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1733948362.746547, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": 
"18446744071595675727", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:20:14 -0500 (0:00:00.405) 0:04:52.382 **** changed: [managed-node3] => (item={u'state': u'present', u'password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'name': u'luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Wednesday 11 December 2024 15:20:15 -0500 (0:00:00.424) 0:04:52.807 **** ok: [managed-node3] TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:324 Wednesday 11 December 2024 15:20:16 -0500 (0:00:00.907) 0:04:53.714 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 11 December 2024 15:20:16 -0500 (0:00:00.100) 0:04:53.814 **** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": 
"partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 11 December 2024 15:20:16 -0500 (0:00:00.076) 0:04:53.891 **** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 11 December 2024 15:20:16 -0500 (0:00:00.052) 0:04:53.944 **** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "size": "10G", "type": "crypt", "uuid": "4255716a-3eac-4ca2-8183-1c97028072f9" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "4e842f38-e3a6-49fb-afb8-8a35bb31f6d1" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 11 December 2024 15:20:16 -0500 (0:00:00.376) 0:04:54.321 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003551", "end": "2024-12-11 15:20:17.099228", "rc": 0, "start": "2024-12-11 15:20:17.095677" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs 
defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 11 December 2024 15:20:17 -0500 (0:00:00.387) 0:04:54.709 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002873", "end": "2024-12-11 15:20:17.508234", "failed_when_result": false, "rc": 0, "start": "2024-12-11 15:20:17.505361" } STDOUT: luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1 /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:20:17 -0500 (0:00:00.459) 0:04:55.169 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 11 December 2024 15:20:17 -0500 (0:00:00.136) 0:04:55.305 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 11 December 2024 15:20:17 -0500 (0:00:00.059) 0:04:55.364 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 11 December 2024 15:20:17 -0500 (0:00:00.058) 0:04:55.422 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 11 December 2024 15:20:17 -0500 (0:00:00.054) 0:04:55.476 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.127) 0:04:55.604 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.058) 0:04:55.662 **** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.053) 0:04:55.715 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.059) 0:04:55.775 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.060) 0:04:55.835 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.059) 0:04:55.895 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.057) 0:04:55.953 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.059) 0:04:56.013 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.058) 0:04:56.071 **** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.056) 0:04:56.127 **** ok: [managed-node3] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.43.66 closed. 
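
The "grow to fill" probe above ran a small check on the target and printed False, so the PV-fill verification that follows has nothing to check. More generally, nearly every member-level check in this block reports "skipping: ... Conditional result was False" because the checks are written for LVM-backed pools, and this pool has "type": "partition". A sketch of the shared guard pattern (variable names illustrative, not the test suite's exact code):

    - name: Verify PV count (sketch of the guard pattern)
      assert:
        that: storage_test_pool_pvs | length == expected_pv_count
      # For a partition-type pool the condition is false, so
      # ansible-playbook prints "skipping" instead of running the check.
      when: storage_test_pool.type == 'lvm'
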
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.301) 0:04:56.428 **** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Wednesday 11 December 2024 15:20:18 -0500 (0:00:00.034) 0:04:56.463 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.077) 0:04:56.541 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.037) 0:04:56.579 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.095) 0:04:56.674 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.039) 0:04:56.713 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.040) 0:04:56.753 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.040) 0:04:56.793 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.050) 0:04:56.844 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.046) 0:04:56.891 **** skipping: [managed-node3] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.045) 0:04:56.936 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.037) 0:04:56.974 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.038) 0:04:57.012 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.044) 0:04:57.057 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.125) 0:04:57.183 **** skipping: [managed-node3] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.085) 0:04:57.269 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.143) 0:04:57.412 **** skipping: [managed-node3] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" 
], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Wednesday 11 December 2024 15:20:19 -0500 (0:00:00.089) 0:04:57.501 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.203) 0:04:57.705 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.074) 0:04:57.780 **** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.063) 0:04:57.843 **** TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.057) 0:04:57.901 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.050) 0:04:57.952 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.089) 0:04:58.041 **** skipping: [managed-node3] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', 
u'_device': u'/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.062) 0:04:58.103 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.133) 0:04:58.237 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.047) 0:04:58.285 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.063) 0:04:58.348 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.043) 0:04:58.391 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.042) 0:04:58.433 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.037) 0:04:58.471 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Wednesday 11 December 2024 15:20:20 -0500 (0:00:00.044) 0:04:58.515 **** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.053) 0:04:58.568 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.104) 0:04:58.673 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.072) 0:04:58.745 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.258) 0:04:59.004 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.053) 0:04:59.058 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.057) 0:04:59.115 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.036) 0:04:59.152 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.049) 0:04:59.201 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.038) 0:04:59.240 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.036) 0:04:59.276 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of 
test volume device] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.037) 0:04:59.314 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.055) 0:04:59.370 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.046) 0:04:59.416 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Wednesday 11 December 2024 15:20:21 -0500 (0:00:00.061) 0:04:59.478 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 11 December 2024 15:20:22 -0500 (0:00:00.048) 0:04:59.526 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 11 December 2024 15:20:22 -0500 (0:00:00.078) 0:04:59.605 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 11 December 2024 15:20:22 -0500 (0:00:00.054) 0:04:59.660 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 11 December 2024 15:20:22 -0500 (0:00:00.046) 0:04:59.706 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
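
The fstab checks above are regex-count assertions: the "cat /etc/fstab" output captured earlier is filtered for lines matching the device, the mount point, and the mount options, and each match count is compared against its expected value (here "1", "1", "1"). A sketch of the pattern, assuming the file content is registered in a variable named storage_test_fstab (an illustrative name):

    - name: Count fstab lines that reference the mount device (sketch)
      set_fact:
        fstab_id_matches: "{{ storage_test_fstab.stdout_lines
          | select('search', '^/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1 ')
          | list }}"

    - name: Verify that the device identifier appears in /etc/fstab (sketch)
      assert:
        that: fstab_id_matches | length == 1
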
TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 11 December 2024 15:20:22 -0500 (0:00:00.036) 0:04:59.743 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Wednesday 11 December 2024 15:20:22 -0500 (0:00:00.049) 0:04:59.792 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 11 December 2024 15:20:22 -0500 (0:00:00.142) 0:04:59.934 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 11 December 2024 15:20:22 -0500 (0:00:00.068) 0:05:00.002 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 11 December 2024 15:20:22 -0500 (0:00:00.065) 0:05:00.068 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948411.008597, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948411.008597, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 435379, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1733948411.008597, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 11 December 2024 15:20:22 -0500 (0:00:00.381) 0:05:00.449 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 11 December 2024 15:20:22 -0500 (0:00:00.065) 0:05:00.515 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 11 December 2024 15:20:23 -0500 (0:00:00.052) 0:05:00.568 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 11 December 2024 15:20:23 -0500 (0:00:00.055) 0:05:00.624 **** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 11 December 2024 15:20:23 -0500 (0:00:00.049) 0:05:00.673 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 11 December 2024 15:20:23 -0500 (0:00:00.042) 0:05:00.715 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 11 December 2024 15:20:23 -0500 (0:00:00.044) 0:05:00.760 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948411.132597, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948411.132597, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 434415, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1733948411.132597, "nlink": 1, "path": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 11 December 2024 15:20:23 -0500 (0:00:00.370) 0:05:01.130 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 11 December 2024 15:20:24 -0500 (0:00:00.653) 0:05:01.784 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.026141", "end": "2024-12-11 15:20:24.573321", "rc": 0, "start": "2024-12-11 15:20:24.547180" } STDOUT: LUKS header information for /dev/sda1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 52 74 76 a4 1c 57 2b f8 2a 4b 1e e0 fe 22 1d f5 cd eb 3b 5a MK salt: 26 e8 48 65 59 6a 02 2f 19 07 ce c1 ad 15 85 d8 e5 8c aa 24 f3 2a 7b 37 d9 1f 4f b9 d2 79 60 e3 MK iterations: 23272 UUID: 4e842f38-e3a6-49fb-afb8-8a35bb31f6d1 Key Slot 0: ENABLED Iterations: 371308 Salt: 4c c9 ab f1 c8 aa 9e 3d f1 a7 82 57 32 31 9a 3d 33 e8 4a 42 1e 1a 34 e0 d8 5c ab 88 a3 c8 77 b0 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED
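
The dump above shows a LUKS1 header (Version: 1, aes in xts-plain64 mode, sha256 hash) with only key slot 0 enabled. The test collects it with a plain command task; a sketch of that step, plus the kind of header assertion the "Check LUKS version" task below would perform if a specific version had been requested (encryption_luks_version is null for this volume, so that check skips):

    - name: Collect LUKS info for this volume (sketch)
      command: cryptsetup luksDump /dev/sda1
      register: luks_dump
      changed_when: false    # read-only query, never reports "changed"

    - name: Check LUKS version (sketch; expected_luks_version is an illustrative guard)
      assert:
        that: luks_dump.stdout is search('Version:\s+1')
      when: expected_luks_version is defined
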
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 11 December 2024 15:20:24 -0500 (0:00:00.384) 0:05:02.169 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 11 December 2024 15:20:24 -0500 (0:00:00.060) 0:05:02.229 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 11 December 2024 15:20:24 -0500 (0:00:00.060) 0:05:02.289 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 11 December 2024 15:20:24 -0500 (0:00:00.048) 0:05:02.338 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 11 December 2024 15:20:24 -0500 (0:00:00.049) 0:05:02.388 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Wednesday 11 December 2024 15:20:24 -0500 (0:00:00.051) 0:05:02.439 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.086) 0:05:02.525 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
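
The crypttab verification that follows mirrors the fstab checks: the captured /etc/crypttab content is filtered down to lines naming this LUKS device, the test asserts there is exactly one, and it then validates the three-field "<name> <backing device> <key file>" layout of that line. A sketch, with storage_test_crypttab as an illustrative register name:

    - name: Check the /etc/crypttab entry for this volume (sketch)
      vars:
        crypttab_entries: "{{ storage_test_crypttab.stdout_lines
          | select('search', 'luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1')
          | list }}"
      assert:
        that:
          - crypttab_entries | length == 1              # exactly one entry
          - crypttab_entries[0].split() | length == 3   # name, device, key
          - crypttab_entries[0].split()[1] == '/dev/sda1'
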
TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.051) 0:05:02.577 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1 /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.057) 0:05:02.634 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.055) 0:05:02.689 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.049) 0:05:02.738 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.049) 0:05:02.787 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.062) 0:05:02.850 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.058) 0:05:02.908 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.063) 0:05:02.971 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.059) 0:05:03.031 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.058) 0:05:03.089 **** skipping: [managed-node3] => { "changed": false, "skip_reason": 
"Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.059) 0:05:03.149 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.061) 0:05:03.210 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.058) 0:05:03.268 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.061) 0:05:03.329 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.058) 0:05:03.388 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.058) 0:05:03.447 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 11 December 2024 15:20:25 -0500 (0:00:00.058) 0:05:03.505 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.081) 0:05:03.587 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.060) 0:05:03.647 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.063) 0:05:03.710 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.068) 0:05:03.778 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.062) 0:05:03.841 **** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.062) 0:05:03.904 **** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.067) 0:05:03.972 **** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.065) 0:05:04.037 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.063) 0:05:04.100 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.058) 0:05:04.159 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.057) 0:05:04.217 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.060) 0:05:04.277 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.061) 0:05:04.339 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.058) 0:05:04.398 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.055) 0:05:04.454 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Wednesday 11 December 2024 15:20:26 -0500 (0:00:00.057) 0:05:04.511 **** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.083) 0:05:04.594 **** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.097) 0:05:04.692 **** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.052) 0:05:04.744 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.113) 0:05:04.858 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.039) 0:05:04.898 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.038) 0:05:04.936 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] 
********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.039) 0:05:04.976 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.052) 0:05:05.028 **** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.068) 0:05:05.096 **** ok: [managed-node3] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.074) 0:05:05.171 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.067) 0:05:05.238 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.065) 0:05:05.304 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.075) 0:05:05.379 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 11 December 2024 15:20:27 -0500 (0:00:00.105) 0:05:05.485 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 11 December 2024 15:20:28 -0500 (0:00:00.101) 0:05:05.586 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 11 December 2024 15:20:28 -0500 (0:00:00.110) 0:05:05.697 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 11 December 2024 15:20:28 -0500 (0:00:00.115) 0:05:05.813 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 11 December 2024 15:20:28 -0500 (0:00:00.065) 0:05:05.879 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:20:28 -0500 (0:00:00.071) 0:05:05.951 **** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Wednesday 11 December 2024 15:20:28 -0500 (0:00:00.050) 0:05:06.001 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:327 Wednesday 11 December 2024 15:20:28 -0500 (0:00:00.056) 0:05:06.057 **** ok: [managed-node3] => { "changed": false, "path": "/tmp/storage_testTa18zclukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:337 Wednesday 11 December 2024 15:20:28 -0500 (0:00:00.427) 0:05:06.485 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 11 December 2024 15:20:29 -0500 (0:00:00.122) 0:05:06.608 **** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 11 December 2024 15:20:29 -0500 (0:00:00.068) 0:05:06.677 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:20:29 -0500 (0:00:00.093) 0:05:06.771 **** included: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:20:29 -0500 (0:00:00.097) 0:05:06.868 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:20:29 -0500 (0:00:00.079) 0:05:06.948 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:20:29 -0500 (0:00:00.142) 0:05:07.090 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:20:29 -0500 (0:00:00.107) 0:05:07.197 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:20:29 -0500 (0:00:00.105) 0:05:07.302 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:20:29 -0500 (0:00:00.056) 0:05:07.359 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:20:29 -0500 (0:00:00.107) 0:05:07.466 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:20:30 -0500 (0:00:00.205) 0:05:07.672 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:20:31 -0500 (0:00:01.449) 0:05:09.121 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:20:31 -0500 (0:00:00.105) 0:05:09.227 **** ok: [managed-node3] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:20:31 -0500 (0:00:00.104) 0:05:09.331 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:20:36 -0500 (0:00:04.441) 0:05:13.773 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:20:36 -0500 (0:00:00.069) 0:05:13.842 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:20:36 -0500 (0:00:00.035) 0:05:13.878 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:20:36 -0500 (0:00:00.041) 0:05:13.919 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:20:36 -0500 (0:00:00.034) 0:05:13.953 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:20:37 -0500 (0:00:00.631) 0:05:14.585 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": 
"network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { 
"name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:20:38 -0500 (0:00:00.968) 0:05:15.554 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:20:38 -0500 (0:00:00.055) 0:05:15.609 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:20:38 -0500 (0:00:00.035) 0:05:15.645 **** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Wednesday 11 December 2024 15:20:42 -0500 (0:00:03.984) 0:05:19.629 **** fatal: [managed-node3]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': False, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:20:42 -0500 (0:00:00.052) 0:05:19.682 **** 
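The failure above is the expected outcome of this scenario: the role was invoked with "encryption: true" but neither encryption_password nor encryption_key, so the blivet module refused to proceed ("encrypted volume 'test1' missing key/password") before taking any actions, and the "Check that we failed in the role" task below asserts exactly that. As an illustration only, a negative test of this shape can be sketched roughly as follows; the block/rescue wrapper and the use of ansible_failed_result are this sketch's assumptions, not the test's actual source, which lives in tests/storage/verify-role-failed.yml as the task paths show:

# Illustrative sketch, not the real verify-role-failed.yml logic.
- name: Attempt to create an encrypted volume with no key (expected to fail)
  block:
    - name: Run the storage role without an encryption password
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true   # deliberately no encryption_password/encryption_key
    - name: Flag a missing failure
      fail:
        msg: role was expected to fail but did not
  rescue:
    - name: Verify the role raised the correct error
      assert:
        that:
          - "'missing key/password' in ansible_failed_result.msg"
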
TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 11 December 2024 15:20:42 -0500 (0:00:00.040) 0:05:19.723 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 11 December 2024 15:20:42 -0500 (0:00:00.135) 0:05:19.858 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 11 December 2024 15:20:42 -0500 (0:00:00.078) 0:05:19.937 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:355 Wednesday 11 December 2024 15:20:42 -0500 (0:00:00.056) 0:05:19.993 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:20:42 -0500 (0:00:00.119) 0:05:20.113 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:20:42 -0500 (0:00:00.090) 0:05:20.204 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:20:42 -0500 (0:00:00.068) 0:05:20.272 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] 
*********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:20:42 -0500 (0:00:00.138) 0:05:20.411 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:20:42 -0500 (0:00:00.074) 0:05:20.486 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:20:43 -0500 (0:00:00.057) 0:05:20.543 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:20:43 -0500 (0:00:00.057) 0:05:20.601 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:20:43 -0500 (0:00:00.058) 0:05:20.660 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:20:43 -0500 (0:00:00.133) 0:05:20.793 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:20:44 -0500 (0:00:01.180) 0:05:21.974 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": 
"test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:20:44 -0500 (0:00:00.055) 0:05:22.029 **** ok: [managed-node3] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:20:44 -0500 (0:00:00.039) 0:05:22.068 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:20:48 -0500 (0:00:04.183) 0:05:26.252 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:20:48 -0500 (0:00:00.072) 0:05:26.325 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:20:48 -0500 (0:00:00.052) 0:05:26.377 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:20:48 -0500 (0:00:00.057) 0:05:26.435 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:20:48 -0500 (0:00:00.048) 0:05:26.484 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:20:49 -0500 (0:00:00.721) 0:05:27.205 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": 
"systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": 
"systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:20:50 -0500 (0:00:01.045) 0:05:28.251 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:20:50 -0500 (0:00:00.068) 0:05:28.319 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:20:50 -0500 (0:00:00.042) 0:05:28.362 **** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create 
device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:21:02 -0500 (0:00:11.233) 0:05:39.595 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:21:02 -0500 (0:00:00.056) 0:05:39.651 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948413.7385998, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "870f1cf1ba31ed7600eba9f4e2762e5655a8c948", "ctime": 1733948413.7345998, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1733948413.7345998, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:21:02 -0500 (0:00:00.421) 0:05:40.073 **** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:21:03 -0500 (0:00:00.459) 0:05:40.532 **** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:21:03 -0500 (0:00:00.061) 0:05:40.594 **** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", 
"src": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:21:03 -0500 (0:00:00.259) 0:05:40.853 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": 
"VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:21:03 -0500 (0:00:00.075) 0:05:40.929 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:21:03 -0500 (0:00:00.073) 0:05:41.002 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 December 2024 15:21:03 -0500 (0:00:00.441) 0:05:41.444 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:21:04 -0500 (0:00:00.513) 0:05:41.958 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 December 2024 15:21:04 -0500 (0:00:00.445) 0:05:42.403 **** skipping: [managed-node3] => (item={u'src': 
u'/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:21:04 -0500 (0:00:00.070) 0:05:42.474 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:21:05 -0500 (0:00:00.525) 0:05:42.999 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948417.5076036, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "a2262a271b1dd566d16d2b1fd12a9effa8850603", "ctime": 1733948415.1916013, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263819, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1733948415.1906013, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 88, "uid": 0, "version": "18446744071595675906", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:21:05 -0500 (0:00:00.438) 0:05:43.438 **** changed: [managed-node3] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node3] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Wednesday 11 December 2024 15:21:06 -0500 (0:00:00.692) 0:05:44.131 **** ok: [managed-node3] TASK [Verify role results] ***************************************************** task 
path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:374 Wednesday 11 December 2024 15:21:07 -0500 (0:00:00.734) 0:05:44.865 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 11 December 2024 15:21:07 -0500 (0:00:00.103) 0:05:44.969 **** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 11 December 2024 15:21:07 -0500 (0:00:00.078) 0:05:45.048 **** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 11 December 2024 15:21:07 -0500 (0:00:00.056) 0:05:45.105 **** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "050796e0-a1f4-49fd-8f0b-888f6c1b105e" }, "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "size": "4G", "type": "crypt", "uuid": "ee875210-fe5c-4da7-8fe9-a6e3c07e5f17" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "eRef9R-riJV-6sGk-Efdx-8G1Q-Wfd0-uEtRuG" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 11 December 2024 15:21:07 -0500 (0:00:00.365) 0:05:45.470 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002819", "end": "2024-12-11 15:21:08.253091", "rc": 0, "start": "2024-12-11 15:21:08.250272" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 11 December 2024 15:21:08 -0500 (0:00:00.387) 0:05:45.858 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002826", "end": "2024-12-11 15:21:08.622955", "failed_when_result": false, "rc": 0, "start": "2024-12-11 15:21:08.620129" } STDOUT: luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:21:08 -0500 (0:00:00.348) 0:05:46.206 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 11 December 2024 15:21:08 -0500 (0:00:00.102) 0:05:46.309 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 11 December 2024 15:21:08 -0500 (0:00:00.044) 0:05:46.354 **** ok: [managed-node3] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.017785", "end": "2024-12-11 15:21:09.167979", "rc": 0, "start": "2024-12-11 15:21:09.150194" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 11 December 2024 15:21:09 -0500 (0:00:00.397) 0:05:46.752 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 11 December 2024 15:21:09 -0500 (0:00:00.052) 0:05:46.804 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 11 December 2024 15:21:09 -0500 (0:00:00.084) 0:05:46.889 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", 
"_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 11 December 2024 15:21:09 -0500 (0:00:00.049) 0:05:46.938 **** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 11 December 2024 15:21:09 -0500 (0:00:00.555) 0:05:47.494 **** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.068) 0:05:47.562 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.067) 0:05:47.629 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.059) 0:05:47.689 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.057) 0:05:47.746 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.056) 0:05:47.802 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.041) 0:05:47.843 **** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.057) 0:05:47.900 **** ok: [managed-node3] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.43.66 closed. 
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.265) 0:05:48.166 **** skipping: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.063) 0:05:48.229 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.178) 0:05:48.408 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.038) 0:05:48.447 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 11 December 2024 15:21:10 -0500 (0:00:00.037) 0:05:48.485 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.044) 0:05:48.529 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.049) 0:05:48.579 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.062) 0:05:48.642 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.057) 0:05:48.699 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.058) 0:05:48.757 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.058) 0:05:48.816 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.058) 0:05:48.875 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.057) 0:05:48.932 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.060) 0:05:48.993 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.122) 0:05:49.116 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node3 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.107) 0:05:49.224 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.044) 0:05:49.268 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.047) 0:05:49.315 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** 
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.048) 0:05:49.364 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.038) 0:05:49.403 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.043) 0:05:49.446 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 11 December 2024 15:21:11 -0500 (0:00:00.058) 0:05:49.505 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.056) 0:05:49.561 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.112) 0:05:49.673 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node3 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.103) 0:05:49.777 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.042) 0:05:49.820 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.036) 0:05:49.857 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.037) 0:05:49.894 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.038) 0:05:49.932 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.086) 0:05:50.019 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.045) 0:05:50.064 **** skipping: [managed-node3] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.043) 0:05:50.107 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node3 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.077) 0:05:50.184 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.053) 0:05:50.238 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.061) 0:05:50.300 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.058) 0:05:50.358 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.060) 0:05:50.418 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 11 December 2024 15:21:12 -0500 (0:00:00.060) 0:05:50.478 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.060) 0:05:50.538 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.058) 0:05:50.597 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.129) 0:05:50.727 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node3 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.105) 0:05:50.833 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.041) 0:05:50.875 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.036) 0:05:50.911 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.038) 0:05:50.951 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
compression is off] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.037) 0:05:50.988 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO compression is on] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.041) 0:05:51.030 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.037) 0:05:51.068 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.039) 0:05:51.107 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.118) 0:05:51.226 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.133) 0:05:51.359 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.058) 0:05:51.418 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Wednesday 11 December 2024 15:21:13 -0500 (0:00:00.070) 0:05:51.488 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.051) 0:05:51.540 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path:
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.044) 0:05:51.585 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.039) 0:05:51.624 **** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.040) 0:05:51.664 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.087) 0:05:51.752 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.075) 0:05:51.827 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.271) 0:05:52.099 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e" }, "changed": false } TASK [Set some facts] 
********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.068) 0:05:52.167 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.061) 0:05:52.229 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.047) 0:05:52.277 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.062) 0:05:52.340 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.044) 0:05:52.385 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.041) 0:05:52.426 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.038) 0:05:52.465 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Wednesday 11 December 2024 15:21:14 -0500 (0:00:00.041) 0:05:52.506 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.047) 0:05:52.553 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.037) 0:05:52.591 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.038) 0:05:52.630 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.069) 0:05:52.700 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.047) 0:05:52.747 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.045) 0:05:52.793 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.038) 0:05:52.832 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.043) 0:05:52.875 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.040) 0:05:52.917 **** ok: [managed-node3] => { "changed": 
false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.057) 0:05:52.974 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.078) 0:05:53.052 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948461.801649, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948461.801649, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 444491, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1733948461.801649, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.388) 0:05:53.441 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 11 December 2024 15:21:15 -0500 (0:00:00.073) 0:05:53.514 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 11 December 2024 15:21:16 -0500 (0:00:00.066) 0:05:53.581 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 11 December 2024 15:21:16 -0500 (0:00:00.089) 0:05:53.671 **** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 11 December 2024 15:21:16 -0500 (0:00:00.078) 0:05:53.750 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 11 December 2024 15:21:16 -0500 (0:00:00.071) 
0:05:53.821 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 11 December 2024 15:21:16 -0500 (0:00:00.068) 0:05:53.890 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948461.9256492, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948461.9256492, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 444339, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1733948461.9256492, "nlink": 1, "path": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 11 December 2024 15:21:16 -0500 (0:00:00.498) 0:05:54.389 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 11 December 2024 15:21:17 -0500 (0:00:00.688) 0:05:55.078 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.026731", "end": "2024-12-11 15:21:18.052697", "rc": 0, "start": "2024-12-11 15:21:18.025966" } STDOUT:
LUKS header information for /dev/mapper/foo-test1
Version: 1
Cipher name: aes
Cipher mode: xts-plain64
Hash spec: sha256
Payload offset: 8192
MK bits: 512
MK digest: 30 78 9f 4a fe 86 c3 46 42 ca bb 25 73 91 c5 01 2b 8b 54 d7
MK salt: a1 81 4a 31 68 fc e7 26 a7 43 73 08 b9 12 0a 5a 95 8c 85 b5 eb 9a 86 66 ca 4f f1 be 6e e5 2e 12
MK iterations: 23173
UUID: 050796e0-a1f4-49fd-8f0b-888f6c1b105e
Key Slot 0: ENABLED
  Iterations: 371834
  Salt: f3 c1 b5 99 22 98 57 22 14 04 eb 4b 97 fa 8b 6b 05 65 30 49 b4 11 33 02 b5 2c c6 39 db c5 95 55
  Key material offset: 8
  AF stripes: 4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
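The dump is what the next several checks assert against: a LUKS1 header, the aes cipher in xts-plain64 mode, and a 512-bit master key (XTS splits it into two 256-bit AES keys), with only key slot 0 populated. A standalone version of those checks, with the patterns taken from the output above (the registered variable luks is hypothetical):

    - hosts: all
      become: true
      tasks:
        - name: Collect LUKS info for the volume
          command: cryptsetup luksDump /dev/mapper/foo-test1
          register: luks
          changed_when: false
        - name: Check LUKS version, cipher, and key size
          assert:
            that:
              - luks.stdout is search('Version:\s+1')
              - luks.stdout is search('Cipher name:\s+aes')
              - luks.stdout is search('Cipher mode:\s+xts-plain64')
              - luks.stdout is search('MK bits:\s+512')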
"changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 11 December 2024 15:21:18 -0500 (0:00:00.084) 0:05:55.834 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 11 December 2024 15:21:18 -0500 (0:00:00.077) 0:05:55.911 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 11 December 2024 15:21:18 -0500 (0:00:00.084) 0:05:55.996 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Wednesday 11 December 2024 15:21:18 -0500 (0:00:00.071) 0:05:56.067 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Wednesday 11 December 2024 15:21:18 -0500 (0:00:00.081) 0:05:56.149 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Wednesday 11 December 2024 15:21:18 -0500 (0:00:00.076) 0:05:56.226 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Wednesday 11 December 2024 15:21:18 -0500 (0:00:00.049) 0:05:56.275 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Wednesday 11 December 2024 15:21:18 -0500 (0:00:00.046) 0:05:56.321 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Wednesday 11 December 2024 15:21:18 -0500 (0:00:00.057) 0:05:56.379 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Wednesday 11 December 2024 15:21:18 -0500 (0:00:00.067) 0:05:56.446 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.156) 0:05:56.603 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.054) 0:05:56.658 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.076) 0:05:56.735 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.074) 0:05:56.810 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.051) 0:05:56.861 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.050) 0:05:56.912 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.067) 0:05:56.979 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.050) 0:05:57.030 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.049) 0:05:57.080 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.046) 0:05:57.127 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.039) 0:05:57.166 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 11 December 2024 15:21:19 -0500 (0:00:00.047) 0:05:57.214 **** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 11 December 2024 15:21:20 -0500 (0:00:00.583) 0:05:57.798 **** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 11 December 2024 15:21:20 -0500 (0:00:00.531) 0:05:58.330 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 11 December 2024 15:21:20 -0500 (0:00:00.099) 0:05:58.430 **** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 11 December 2024 15:21:20 -0500 (0:00:00.066) 0:05:58.496 **** ok: [managed-node3] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 11 December 2024 15:21:21 -0500 (0:00:00.430) 0:05:58.927 **** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 11 December 2024 15:21:21 -0500 (0:00:00.131) 0:05:59.058 **** skipping: [managed-node3] => {} TASK 
[Show test pool size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 11 December 2024 15:21:21 -0500 (0:00:00.141) 0:05:59.200 **** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 11 December 2024 15:21:21 -0500 (0:00:00.135) 0:05:59.335 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Wednesday 11 December 2024 15:21:21 -0500 (0:00:00.128) 0:05:59.463 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.068) 0:05:59.531 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.090) 0:05:59.622 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.116) 0:05:59.738 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.067) 0:05:59.806 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.083) 0:05:59.889 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.072) 0:05:59.961 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Wednesday 11 December 2024 15:21:22 -0500 
(0:00:00.092) 0:06:00.054 **** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.059) 0:06:00.113 **** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.063) 0:06:00.176 **** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.064) 0:06:00.241 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.058) 0:06:00.299 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.059) 0:06:00.359 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.061) 0:06:00.420 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Wednesday 11 December 2024 15:21:22 -0500 (0:00:00.057) 0:06:00.477 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Wednesday 11 December 2024 15:21:23 -0500 (0:00:00.063) 0:06:00.541 **** ok: [managed-node3] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Wednesday 11 December 2024 15:21:23 -0500 (0:00:00.065) 0:06:00.606 **** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" }
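Size verification compares the actual size parsed from the LV (4294967296 bytes, i.e. 4 GiB, shown above in bytes, LVM, and parted notations) against the requested size; every thin-pool branch in this stretch is skipped because the volume is not thin-provisioned. The core comparison is one filter call, sketched here with Ansible's human_to_bytes filter (the fact name expected_bytes is hypothetical):

    - hosts: all
      tasks:
        - name: Parse the requested size of the volume
          set_fact:
            expected_bytes: "{{ '4 GiB' | human_to_bytes }}"  # 4294967296
        - name: Assert expected size is actual size
          assert:
            that: (expected_bytes | int) == 4294967296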
Wednesday 11 December 2024 15:21:23 -0500 (0:00:00.064) 0:06:00.670 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 11 December 2024 15:21:23 -0500 (0:00:00.077) 0:06:00.748 **** ok: [managed-node3] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.018384", "end": "2024-12-11 15:21:23.553973", "rc": 0, "start": "2024-12-11 15:21:23.535589" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 11 December 2024 15:21:23 -0500 (0:00:00.480) 0:06:01.229 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 11 December 2024 15:21:23 -0500 (0:00:00.092) 0:06:01.321 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 11 December 2024 15:21:23 -0500 (0:00:00.112) 0:06:01.434 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 11 December 2024 15:21:23 -0500 (0:00:00.072) 0:06:01.506 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.069) 0:06:01.576 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.070) 0:06:01.646 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.080) 0:06:01.727 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: 
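
(The cache verification above starts from the exact lvs invocation shown in the log; --nameprefixes --unquoted turns the output into shell-style KEY=value pairs, which a regex can pick apart directly. A sketch of the command plus the segtype extraction — lvs_info is a hypothetical register name, and the parsing shown here is one plausible way to arrive at the ["linear"] fact seen above:

    - name: Get information about the LV
      ansible.builtin.command:
        argv:
          - lvs
          - --noheadings
          - --nameprefixes
          - --units=b
          - --nosuffix
          - --unquoted
          - -o
          - name,attr,cache_total_blocks,chunk_size,segtype
          - foo/test1
      register: lvs_info
      changed_when: false

    - name: Set LV segment type (sketch)
      ansible.builtin.set_fact:
        storage_test_lv_segtype: "{{ lvs_info.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"
)
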
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.086) 0:06:01.814 **** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.055) 0:06:01.870 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:377 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.082) 0:06:01.952 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.111) 0:06:02.064 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.080) 0:06:02.144 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.050) 0:06:02.195 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.128) 0:06:02.323 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: 
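
(The variable loading visible here follows the usual system-roles pattern: candidate vars files are tried from the most generic (os family) to the most specific (full distribution version), and only the ones that exist are loaded — hence CentOS_7.yml is included while RedHat.yml, CentOS.yml and CentOS_7.9.yml are skipped. A sketch of that loop, under the assumption that the skip condition is a controller-side file-existence test:

    - name: Set platform/version specific variables (sketch)
      ansible.builtin.include_vars: "{{ role_path }}/vars/{{ item }}"
      loop:
        - "{{ ansible_facts['os_family'] }}.yml"
        - "{{ ansible_facts['distribution'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_major_version'] }}.yml"
        - "{{ ansible_facts['distribution'] }}_{{ ansible_facts['distribution_version'] }}.yml"
      when: (role_path ~ '/vars/' ~ item) is file
)
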
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.057) 0:06:02.381 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.059) 0:06:02.440 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:21:24 -0500 (0:00:00.059) 0:06:02.500 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:21:25 -0500 (0:00:00.063) 0:06:02.563 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:21:25 -0500 (0:00:00.259) 0:06:02.822 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:21:26 -0500 (0:00:01.249) 0:06:04.072 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:21:26 -0500 (0:00:00.069) 0:06:04.142 **** ok: [managed-node3] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: 
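
(The storage_pools value printed above is the complete input for this phase of the test: re-running the role against the same pool with no encryption options in the spec is what exercises "preservation of encryption settings". A minimal play reproducing that input — the host pattern and become setting are assumptions, the pool definition is taken verbatim from the output:

    - hosts: managed-node3
      become: true
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
      roles:
        - fedora.linux_system_roles.storage
)
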
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:21:26 -0500 (0:00:00.070) 0:06:04.212 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:21:30 -0500 (0:00:04.292) 0:06:08.504 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:21:31 -0500 (0:00:00.068) 0:06:08.573 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:21:31 -0500 (0:00:00.038) 0:06:08.611 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:21:31 -0500 (0:00:00.046) 0:06:08.658 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:21:31 -0500 (0:00:00.052) 0:06:08.711 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:21:31 -0500 (0:00:00.784) 0:06:09.496 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { 
"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": 
"mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service": { "name": "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { 
"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:21:33 -0500 (0:00:01.040) 0:06:10.536 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:21:33 -0500 (0:00:00.058) 0:06:10.595 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d4e842f38\x2de3a6\x2d49fb\x2dafb8\x2d8a35bb31f6d1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service", "name": "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice dev-sda1.device systemd-journald.socket systemd-readahead-collect.service systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1 /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4e842f38-e3a6-49fb-afb8-8a35bb31f6d1 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", 
"StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:21:33 -0500 (0:00:00.507) 0:06:11.103 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:21:37 -0500 (0:00:04.184) 0:06:15.287 **** skipping: 
[managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:21:37 -0500 (0:00:00.058) 0:06:15.346 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948464.793652, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e54ab630b87ec152c14c386d9417b4af51d33d34", "ctime": 1733948464.790652, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1733948464.790652, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:21:38 -0500 (0:00:00.414) 0:06:15.761 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:21:38 -0500 (0:00:00.069) 0:06:15.830 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d4e842f38\x2de3a6\x2d49fb\x2dafb8\x2d8a35bb31f6d1.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service", "name": "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service", "IgnoreOnIsolate": "no", 
"IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d4e842f38\\x2de3a6\\x2d49fb\\x2dafb8\\x2d8a35bb31f6d1.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:21:38 -0500 (0:00:00.551) 0:06:16.382 **** ok: [managed-node3] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": 
"foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:21:38 -0500 (0:00:00.078) 0:06:16.460 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:21:39 -0500 (0:00:00.059) 0:06:16.520 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:21:39 -0500 (0:00:00.058) 0:06:16.579 **** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 December 2024 15:21:39 -0500 (0:00:00.082) 0:06:16.661 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:21:39 -0500 (0:00:00.491) 0:06:17.153 **** ok: [managed-node3] => (item={u'src': u'/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 December 2024 15:21:40 -0500 (0:00:00.480) 0:06:17.633 **** skipping: [managed-node3] => (item={u'src': u'/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:21:40 -0500 (0:00:00.050) 0:06:17.684 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:21:40 -0500 (0:00:00.526) 0:06:18.211 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948468.621656, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, 
"blocks": 8, "charset": "us-ascii", "checksum": "42d2e5a11043d088476291fbe56403001d2d7a56", "ctime": 1733948466.5386539, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263819, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1733948466.537654, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744071595676072", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:21:41 -0500 (0:00:00.479) 0:06:18.690 **** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Wednesday 11 December 2024 15:21:41 -0500 (0:00:00.116) 0:06:18.807 **** ok: [managed-node3] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:391 Wednesday 11 December 2024 15:21:42 -0500 (0:00:00.784) 0:06:19.591 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:398 Wednesday 11 December 2024 15:21:42 -0500 (0:00:00.092) 0:06:19.684 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 11 December 2024 15:21:42 -0500 (0:00:00.125) 0:06:19.809 **** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 11 December 2024 15:21:42 -0500 (0:00:00.063) 0:06:19.873 **** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 11 December 2024 15:21:42 -0500 (0:00:00.038) 0:06:19.912 **** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "050796e0-a1f4-49fd-8f0b-888f6c1b105e" }, "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "size": "4G", "type": "crypt", "uuid": "ee875210-fe5c-4da7-8fe9-a6e3c07e5f17" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "eRef9R-riJV-6sGk-Efdx-8G1Q-Wfd0-uEtRuG" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 11 December 2024 15:21:42 -0500 (0:00:00.444) 0:06:20.356 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003632", "end": "2024-12-11 15:21:43.146608", "rc": 0, "start": "2024-12-11 15:21:43.142976" } STDOUT: # system_role:storage # 
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e /opt/test1 xfs defaults 0 0
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 11 December 2024 15:21:43 -0500 (0:00:00.385) 0:06:20.742 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002842", "end": "2024-12-11 15:21:43.512031", "failed_when_result": false, "rc": 0, "start": "2024-12-11 15:21:43.509189" } STDOUT: luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e /dev/mapper/foo-test1 -
TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:21:43 -0500 (0:00:00.378) 0:06:21.121 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3
TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 11 December 2024 15:21:43 -0500 (0:00:00.106) 0:06:21.227 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false }
TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 11 December 2024 15:21:43 -0500 (0:00:00.045) 0:06:21.272 **** ok: [managed-node3] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.018662", "end": "2024-12-11 15:21:44.054677", "rc": 0, "start": "2024-12-11 15:21:44.036015" } STDOUT: 0
TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 11 December 2024 15:21:44 -0500 (0:00:00.387) 0:06:21.660 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
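The layout verified above (VG "foo" on /dev/sda, LV "test1" carrying LUKS1 with xfs on top, mounted at /opt/test1 and registered in /etc/fstab and /etc/crypttab) corresponds to a storage_pools input along the following lines. This is a minimal sketch reconstructed from the logged facts rather than a copy of the test playbook, and it omits passphrase/keyfile handling (encryption_password or encryption_key):

    - name: Recreate the verified layout (sketch; values taken from this run)
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                encryption_luks_version: luks1
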
TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 11 December 2024 15:21:44 -0500 (0:00:00.081) 0:06:21.741 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3
TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 11 December 2024 15:21:44 -0500 (0:00:00.116) 0:06:21.858 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false }
TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 11 December 2024 15:21:44 -0500 (0:00:00.095) 0:06:21.954 **** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" }
TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 11 December 2024 15:21:44 -0500 (0:00:00.529) 0:06:22.483 **** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false }
TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 11 December 2024 15:21:45 -0500 (0:00:00.089) 0:06:22.573 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false }
TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 11 December 2024 15:21:45 -0500 (0:00:00.072) 0:06:22.645 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 11 December 2024 15:21:45 -0500 (0:00:00.095) 0:06:22.741 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }
TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 11 December 2024 15:21:45 -0500 (0:00:00.081) 0:06:22.823 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false }
TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 11 December 2024 15:21:45 -0500 (0:00:00.071) 0:06:22.894 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Wednesday 11 December 2024 15:21:45 -0500 (0:00:00.055) 0:06:22.950 **** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed
TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Wednesday 11 December 2024 15:21:45 -0500 (0:00:00.113) 0:06:23.063 **** ok: [managed-node3] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.43.66 closed.
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.126) 0:06:23.393 **** skipping: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" }
TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.174) 0:06:23.519 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3
TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.174) 0:06:23.694 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.074) 0:06:23.768 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.074) 0:06:23.843 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.059) 0:06:23.902 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.066) 0:06:23.969 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
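The "Check the type of each PV" assertion above ran against /dev/sda with an expected type of "disk". An equivalent standalone check could look like this (a sketch, not the role's own task; it relies only on lsblk's TYPE column):

    - name: Read the device type of the pool member (sketch)
      command: lsblk --noheadings --nodeps -o TYPE /dev/sda
      register: pv_type
      changed_when: false

    - name: Assert the member is a bare disk, as the test expects
      assert:
        that:
          - pv_type.stdout | trim == 'disk'
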
TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.119) 0:06:24.089 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.082) 0:06:24.171 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.061) 0:06:24.233 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.086) 0:06:24.319 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.057) 0:06:24.377 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.059) 0:06:24.436 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false }
TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Wednesday 11 December 2024 15:21:46 -0500 (0:00:00.059) 0:06:24.495 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3
TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 11 December 2024 15:21:47 -0500 (0:00:00.170) 0:06:24.666 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node3
TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 11 December 2024 15:21:47 -0500 (0:00:00.418) 0:06:25.084 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set LV segment type] ***************************************************** task path:
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 11 December 2024 15:21:47 -0500 (0:00:00.077) 0:06:25.162 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 11 December 2024 15:21:47 -0500 (0:00:00.089) 0:06:25.251 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 11 December 2024 15:21:47 -0500 (0:00:00.085) 0:06:25.337 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 11 December 2024 15:21:47 -0500 (0:00:00.081) 0:06:25.419 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 11 December 2024 15:21:48 -0500 (0:00:00.143) 0:06:25.563 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 11 December 2024 15:21:48 -0500 (0:00:00.064) 0:06:25.627 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Wednesday 11 December 2024 15:21:48 -0500 (0:00:00.103) 0:06:25.730 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 11 December 2024 15:21:48 -0500 (0:00:00.153) 0:06:25.884 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node3 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 11 December 2024 15:21:48 -0500 (0:00:00.133) 0:06:26.018 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 11 December 2024 15:21:48 -0500 (0:00:00.134) 0:06:26.153 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 11 December 2024 15:21:48 -0500 (0:00:00.137) 0:06:26.290 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 11 December 2024 15:21:48 -0500 (0:00:00.110) 0:06:26.401 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Wednesday 11 December 2024 15:21:48 -0500 (0:00:00.075) 0:06:26.477 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 11 December 2024 15:21:49 -0500 (0:00:00.226) 0:06:26.704 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 11 December 2024 15:21:49 -0500 (0:00:00.107) 0:06:26.811 **** skipping: [managed-node3] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 11 December 2024 15:21:49 -0500 (0:00:00.078) 0:06:26.889 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node3 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 11 December 2024 15:21:49 -0500 (0:00:00.127) 0:06:27.017 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 11 December 2024 15:21:49 -0500 (0:00:00.074) 0:06:27.091 **** ok: [managed-node3] => { "changed": false } MSG: All assertions 
passed
TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 11 December 2024 15:21:49 -0500 (0:00:00.069) 0:06:27.160 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 11 December 2024 15:21:49 -0500 (0:00:00.063) 0:06:27.224 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 11 December 2024 15:21:49 -0500 (0:00:00.059) 0:06:27.284 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 11 December 2024 15:21:49 -0500 (0:00:00.064) 0:06:27.348 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false }
TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 11 December 2024 15:21:49 -0500 (0:00:00.101) 0:06:27.449 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false }
TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Wednesday 11 December 2024 15:21:50 -0500 (0:00:00.093) 0:06:27.543 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3
TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 11 December 2024 15:21:50 -0500 (0:00:00.269) 0:06:27.812 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node3
TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 11 December 2024 15:21:50 -0500 (0:00:00.145) 0:06:27.958 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 11 December 2024 15:21:50 -0500 (0:00:00.090) 0:06:28.049 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
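Pool-level encryption is false in this run, so the member checks above expect zero /etc/crypttab entries for /dev/sda; the single entry seen earlier belongs to the volume-level LUKS device instead. A manual spot check could look like this (a sketch; grep exits non-zero when nothing matches, hence failed_when: false):

    - name: Count crypttab entries backed by the pool member (sketch)
      command: grep -c ' /dev/sda ' /etc/crypttab
      register: member_crypttab
      failed_when: false
      changed_when: false

    - name: An unencrypted pool should contribute no crypttab entries
      assert:
        that:
          - member_crypttab.stdout | int == 0
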
TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 11 December 2024 15:21:50 -0500 (0:00:00.098) 0:06:28.148 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 11 December 2024 15:21:50 -0500 (0:00:00.101) 0:06:28.249 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 11 December 2024 15:21:50 -0500 (0:00:00.061) 0:06:28.311 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 11 December 2024 15:21:50 -0500 (0:00:00.060) 0:06:28.371 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 11 December 2024 15:21:50 -0500 (0:00:00.059) 0:06:28.431 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false }
TASK [Check Stratis] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Wednesday 11 December 2024 15:21:50 -0500 (0:00:00.055) 0:06:28.486 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3
TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 11 December 2024 15:21:51 -0500 (0:00:00.177) 0:06:28.664 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Wednesday 11 December 2024 15:21:51 -0500 (0:00:00.056) 0:06:28.720 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that the pool was created] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 11 December 2024 15:21:51 -0500 (0:00:00.061) 0:06:28.782 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify that encryption is correctly set] ********************************* task path:
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Wednesday 11 December 2024 15:21:51 -0500 (0:00:00.078) 0:06:28.860 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Wednesday 11 December 2024 15:21:51 -0500 (0:00:00.074) 0:06:28.935 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Wednesday 11 December 2024 15:21:51 -0500 (0:00:00.057) 0:06:28.992 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Wednesday 11 December 2024 15:21:51 -0500 (0:00:00.058) 0:06:29.050 **** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 11 December 2024 15:21:51 -0500 (0:00:00.058) 0:06:29.109 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:21:51 -0500 (0:00:00.115) 0:06:29.224 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:21:51 -0500 (0:00:00.089) 0:06:29.314 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3
TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 11 December 2024 15:21:52 -0500 (0:00:00.486) 0:06:29.801 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e" }, "changed": false }
TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 11 December 2024 15:21:52 -0500 (0:00:00.106) 0:06:29.907 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 11 December 2024 15:21:52 -0500 (0:00:00.140) 0:06:30.048 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Wednesday 11 December 2024 15:21:52 -0500 (0:00:00.100) 0:06:30.148 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Wednesday 11 December 2024 15:21:52 -0500 (0:00:00.083) 0:06:30.232 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Wednesday 11 December 2024 15:21:52 -0500 (0:00:00.058) 0:06:30.291 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Wednesday 11 December 2024 15:21:52 -0500 (0:00:00.074) 0:06:30.365 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Wednesday 11 December 2024 15:21:52 -0500 (0:00:00.096) 0:06:30.462 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
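The mount checks above pass because the decrypted mapper device, not the raw LV, is what ends up mounted at /opt/test1. One way to reproduce the check by hand (a sketch; findmnt reports the backing source of a mount point):

    - name: Look up what is mounted at the test mount point (sketch)
      command: findmnt --noheadings -o SOURCE /opt/test1
      register: mnt
      changed_when: false

    - name: Assert the mount is backed by the LUKS mapper device
      assert:
        that:
          - mnt.stdout | trim == '/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e'
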
TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Wednesday 11 December 2024 15:21:53 -0500 (0:00:00.082) 0:06:30.544 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Wednesday 11 December 2024 15:21:53 -0500 (0:00:00.079) 0:06:30.624 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Wednesday 11 December 2024 15:21:53 -0500 (0:00:00.077) 0:06:30.702 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 11 December 2024 15:21:53 -0500 (0:00:00.101) 0:06:30.804 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false }
TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 11 December 2024 15:21:53 -0500 (0:00:00.120) 0:06:30.924 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 11 December 2024 15:21:53 -0500 (0:00:00.068) 0:06:30.992 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 11 December 2024 15:21:53 -0500 (0:00:00.078) 0:06:31.071 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 11 December 2024 15:21:53 -0500 (0:00:00.059) 0:06:31.130 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
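The match lists set above are effectively line filters over /etc/fstab: exactly one line must carry both the device identifier and the mount point. The same count can be derived directly (a sketch; the slurp/regex chain is illustrative, not the test's own implementation):

    - name: Read /etc/fstab (sketch)
      slurp:
        src: /etc/fstab
      register: fstab_raw

    - name: Exactly one fstab line should pair the device with /opt/test1
      assert:
        that:
          - >-
            (fstab_raw.content | b64decode).splitlines()
            | select('search', '^/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e\s+/opt/test1\s')
            | list | length == 1
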
TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Wednesday 11 December 2024 15:21:53 -0500 (0:00:00.105) 0:06:31.235 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }
TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 11 December 2024 15:21:53 -0500 (0:00:00.066) 0:06:31.302 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 11 December 2024 15:21:53 -0500 (0:00:00.144) 0:06:31.446 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 11 December 2024 15:21:54 -0500 (0:00:00.164) 0:06:31.611 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948478.0416658, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948461.801649, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 444491, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1733948461.801649, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 11 December 2024 15:21:54 -0500 (0:00:00.612) 0:06:32.223 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 11 December 2024 15:21:54 -0500 (0:00:00.128) 0:06:32.352 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 11 December 2024 15:21:54 -0500 (0:00:00.093) 0:06:32.445 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Process volume type (set initial value) (1/2)] *************************** task path:
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 11 December 2024 15:21:55 -0500 (0:00:00.120) 0:06:32.566 **** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 11 December 2024 15:21:55 -0500 (0:00:00.091) 0:06:32.657 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 11 December 2024 15:21:55 -0500 (0:00:00.081) 0:06:32.739 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 11 December 2024 15:21:55 -0500 (0:00:00.113) 0:06:32.853 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948461.9256492, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948461.9256492, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 444339, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1733948461.9256492, "nlink": 1, "path": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 11 December 2024 15:21:55 -0500 (0:00:00.478) 0:06:33.331 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 11 December 2024 15:21:56 -0500 (0:00:00.935) 0:06:34.267 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.026241", "end": "2024-12-11 15:21:57.132973", "rc": 0, "start": "2024-12-11 15:21:57.106732" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 30 78 9f 4a fe 86 c3 46 42 ca bb 25 73 91 c5 01 2b 8b 54 d7 MK salt: a1 81 4a 31 68 fc e7 26 a7 43 73 08 b9 12 0a 5a 95 8c 85 b5 eb 9a 86 66 ca 4f f1 be 6e e5 2e 12 MK iterations: 23173 UUID: 050796e0-a1f4-49fd-8f0b-888f6c1b105e Key Slot 0: ENABLED Iterations: 371834 Salt: f3 c1 b5 99 22 98 57 22 14 04 
eb 4b 97 fa 8b 6b 05 65 30 49 b4 11 33 02 b5 2c c6 39 db c5 95 55 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED
TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 11 December 2024 15:21:57 -0500 (0:00:00.477) 0:06:34.744 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 11 December 2024 15:21:57 -0500 (0:00:00.096) 0:06:34.840 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 11 December 2024 15:21:57 -0500 (0:00:00.133) 0:06:34.973 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 11 December 2024 15:21:57 -0500 (0:00:00.081) 0:06:35.055 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 11 December 2024 15:21:57 -0500 (0:00:00.092) 0:06:35.148 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Wednesday 11 December 2024 15:21:57 -0500 (0:00:00.074) 0:06:35.222 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Wednesday 11 December 2024 15:21:57 -0500 (0:00:00.119) 0:06:35.341 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Wednesday 11 December 2024 15:21:57 -0500 (0:00:00.096) 0:06:35.438 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
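The luksDump header above is also what the "Check LUKS version" assertion consumes: the volume requested encryption_luks_version luks1, and the on-disk header reports Version: 1. A standalone equivalent (a sketch; the command is the one the test ran, the assertion wording is illustrative):

    - name: Dump the LUKS header of the raw LV (sketch)
      command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false

    - name: Assert the on-disk header is LUKS1, as requested
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+1')
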
TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.111) 0:06:35.549 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.067) 0:06:35.617 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.070) 0:06:35.688 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.092) 0:06:35.780 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed
TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.077) 0:06:35.858 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.067) 0:06:35.926 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.058) 0:06:35.984 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.117) 0:06:36.102 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.067) 0:06:36.169 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.141) 0:06:36.311 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
[Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.077) 0:06:36.388 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 11 December 2024 15:21:58 -0500 (0:00:00.061) 0:06:36.450 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 11 December 2024 15:21:59 -0500 (0:00:00.075) 0:06:36.525 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 11 December 2024 15:21:59 -0500 (0:00:00.059) 0:06:36.585 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 11 December 2024 15:21:59 -0500 (0:00:00.067) 0:06:36.653 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 11 December 2024 15:21:59 -0500 (0:00:00.056) 0:06:36.710 **** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 11 December 2024 15:21:59 -0500 (0:00:00.662) 0:06:37.372 **** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 11 December 2024 15:22:00 -0500 (0:00:00.502) 0:06:37.874 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 11 December 2024 15:22:00 -0500 (0:00:00.099) 0:06:37.974 **** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 11 December 2024 15:22:00 -0500 (0:00:00.050) 0:06:38.024 **** ok: [managed-node3] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 11 December 2024 15:22:00 -0500 (0:00:00.399) 0:06:38.424 **** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 11 December 2024 15:22:00 -0500 (0:00:00.064) 0:06:38.488 **** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.059) 0:06:38.547 **** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.055) 0:06:38.603 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.053) 0:06:38.657 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.046) 0:06:38.703 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.038) 0:06:38.742 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.037) 0:06:38.779 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.050) 0:06:38.830 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** 
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.145) 0:06:38.976 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.061) 0:06:39.037 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.059) 0:06:39.096 **** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.060) 0:06:39.157 **** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.059) 0:06:39.217 **** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.057) 0:06:39.275 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.068) 0:06:39.343 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.067) 0:06:39.411 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Wednesday 11 December 2024 15:22:01 -0500 (0:00:00.059) 0:06:39.471 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Wednesday 11 December 2024 15:22:02 -0500 (0:00:00.057) 0:06:39.528 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** 
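The size assertions in this stretch of the run pass because the requested "4g" and the measured LVM size resolve to the same byte count: 4 × 1024³ = 4294967296 bytes, on a 10 GiB (10737418240-byte) parent pool. A one-task sketch of that arithmetic using Ansible's human_to_bytes filter (illustrative only; the test derives the numbers from its own parsing tasks):

- name: Illustrative size conversions (sketch)
  assert:
    that:
      # "4g" in binary units: 4 * 1024 * 1024 * 1024
      - ("4g" | human_to_bytes) == 4294967296
      # the parent pool on sda: 10 * 1024 * 1024 * 1024
      - ("10g" | human_to_bytes) == 10737418240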
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Wednesday 11 December 2024 15:22:02 -0500 (0:00:00.055) 0:06:39.584 **** ok: [managed-node3] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Wednesday 11 December 2024 15:22:02 -0500 (0:00:00.067) 0:06:39.651 **** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Wednesday 11 December 2024 15:22:02 -0500 (0:00:00.067) 0:06:39.718 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 11 December 2024 15:22:02 -0500 (0:00:00.075) 0:06:39.794 **** ok: [managed-node3] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.020648", "end": "2024-12-11 15:22:02.632197", "rc": 0, "start": "2024-12-11 15:22:02.611549" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 11 December 2024 15:22:02 -0500 (0:00:00.440) 0:06:40.235 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 11 December 2024 15:22:02 -0500 (0:00:00.061) 0:06:40.297 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 11 December 2024 15:22:02 -0500 (0:00:00.065) 0:06:40.362 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 11 December 2024 15:22:02 -0500 (0:00:00.043) 0:06:40.406 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 11 December 2024 15:22:02 -0500 (0:00:00.054) 0:06:40.460 **** skipping: [managed-node3] 
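The lvs query above confirms the LV carries no cache: LVM2_SEGTYPE=linear and an empty LVM2_CACHE_TOTAL_BLOCKS. A standalone sketch that mirrors the exact command recorded in the log and asserts on its output:

- name: Query LV layout with lvs (same flags as the logged command)
  command:
    argv:
      - lvs
      - --noheadings
      - --nameprefixes
      - --units=b
      - --nosuffix
      - --unquoted
      - -o
      - name,attr,cache_total_blocks,chunk_size,segtype
      - foo/test1
  register: lv_info
  changed_when: false   # read-only query

- name: Assert the volume is a plain linear LV (no cache attached)
  assert:
    that:
      - "'LVM2_SEGTYPE=linear' in lv_info.stdout"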
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 11 December 2024 15:22:02 -0500 (0:00:00.044) 0:06:40.505 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 11 December 2024 15:22:03 -0500 (0:00:00.051) 0:06:40.556 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:22:03 -0500 (0:00:00.064) 0:06:40.621 **** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Wednesday 11 December 2024 15:22:03 -0500 (0:00:00.054) 0:06:40.676 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 11 December 2024 15:22:03 -0500 (0:00:00.052) 0:06:40.728 **** changed: [managed-node3] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:404 Wednesday 11 December 2024 15:22:03 -0500 (0:00:00.420) 0:06:41.149 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 11 December 2024 15:22:03 -0500 (0:00:00.118) 0:06:41.268 **** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 11 December 2024 15:22:03 -0500 (0:00:00.073) 0:06:41.342 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:22:03 -0500 (0:00:00.098) 0:06:41.440 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : 
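The empty marker file /opt/test1/quux created above is the test's data-preservation canary: after the deliberately failing role run, it is stat-ed again (the checksum da39a3ee5e6b4b0d3255bfef95601890afd80709 reported near the end of this section is the SHA-1 of zero bytes, confirming the file is still intact). A condensed sketch of the create-then-verify pattern, with module choices assumed for illustration:

- name: Create the marker file before the destructive attempt
  file:
    path: /opt/test1/quux
    state: touch
    mode: "0644"

# ... the storage-role run that is expected to fail goes here ...

- name: Stat the marker file after the failed run
  stat:
    path: /opt/test1/quux
  register: marker

- name: Assert the data was preserved
  assert:
    that:
      - marker.stat.exists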
Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:22:04 -0500 (0:00:00.111) 0:06:41.551 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:22:04 -0500 (0:00:00.071) 0:06:41.623 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:22:04 -0500 (0:00:00.134) 0:06:41.757 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:22:04 -0500 (0:00:00.060) 0:06:41.818 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:22:04 -0500 (0:00:00.056) 0:06:41.875 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:22:04 -0500 (0:00:00.055) 0:06:41.930 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:22:04 -0500 (0:00:00.059) 0:06:41.989 **** included: 
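The CentOS_7.yml vars loaded below select the blivet package set for this platform; the last entry of blivet_package_list is an inline Jinja conditional that swaps in libblockdev-s390 on s390x hosts and plain libblockdev everywhere else. A reduced sketch of how that conditional renders (hypothetical task name, shortened package list):

- name: Show how the architecture-conditional package resolves
  vars:
    blivet_packages_sketch:
      - python-blivet3
      - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
  debug:
    msg: "{{ blivet_packages_sketch }}"   # ['python-blivet3', 'libblockdev'] on x86_64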
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:22:04 -0500 (0:00:00.139) 0:06:42.129 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:22:05 -0500 (0:00:01.242) 0:06:43.372 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:22:05 -0500 (0:00:00.128) 0:06:43.500 **** ok: [managed-node3] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:22:06 -0500 (0:00:00.132) 0:06:43.633 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:22:10 -0500 (0:00:04.081) 0:06:47.714 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:22:10 -0500 (0:00:00.122) 0:06:47.837 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:22:10 -0500 (0:00:00.093) 0:06:47.931 **** skipping: [managed-node3] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:22:10 -0500 (0:00:00.062) 0:06:47.993 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:22:10 -0500 (0:00:00.058) 0:06:48.052 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:22:11 -0500 (0:00:01.067) 0:06:49.120 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": 
"getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": 
"stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { 
"name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { 
"name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service": { "name": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:22:13 -0500 (0:00:01.523) 0:06:50.644 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:22:13 -0500 (0:00:00.197) 0:06:50.842 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d050796e0\x2da1f4\x2d49fd\x2d8f0b\x2d888f6c1b105e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "name": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-replay.service cryptsetup-pre.target systemd-journald.socket system-systemd\\x2dcryptsetup.slice systemd-readahead-collect.service dev-mapper-foo\\x2dtest1.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": 
"0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "dev-mapper-luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:22:13 -0500 (0:00:00.543) 0:06:51.385 **** fatal: [managed-node3]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Wednesday 11 December 2024 15:22:18 -0500 (0:00:04.455) 0:06:55.841 **** fatal: [managed-node3]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': 
False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:22:18 -0500 (0:00:00.093) 0:06:55.934 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d050796e0\x2da1f4\x2d49fd\x2d8f0b\x2d888f6c1b105e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "name": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", 
"PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 11 December 2024 15:22:19 -0500 (0:00:00.692) 0:06:56.627 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 11 December 2024 15:22:19 -0500 (0:00:00.074) 0:06:56.701 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 11 December 2024 15:22:19 -0500 (0:00:00.104) 0:06:56.806 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 11 December 2024 15:22:19 -0500 (0:00:00.067) 0:06:56.873 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948523.5377123, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1733948523.5377123, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1733948523.5377123, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744071669022016", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file 
presence] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 11 December 2024 15:22:19 -0500 (0:00:00.528) 0:06:57.402 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:427 Wednesday 11 December 2024 15:22:19 -0500 (0:00:00.070) 0:06:57.473 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:22:20 -0500 (0:00:00.164) 0:06:57.637 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:22:20 -0500 (0:00:00.072) 0:06:57.710 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:22:20 -0500 (0:00:00.048) 0:06:57.759 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:22:20 -0500 (0:00:00.092) 0:06:57.852 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:22:20 -0500 (0:00:00.037) 0:06:57.889 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list 
of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:22:20 -0500 (0:00:00.037) 0:06:57.926 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:22:20 -0500 (0:00:00.045) 0:06:57.971 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:22:20 -0500 (0:00:00.061) 0:06:58.032 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:22:20 -0500 (0:00:00.109) 0:06:58.142 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:22:21 -0500 (0:00:01.312) 0:06:59.455 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:22:22 -0500 (0:00:00.080) 0:06:59.535 **** ok: [managed-node3] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:22:22 -0500 (0:00:00.062) 0:06:59.597 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } 
TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:22:26 -0500 (0:00:04.237) 0:07:03.835 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:22:26 -0500 (0:00:00.147) 0:07:03.982 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:22:26 -0500 (0:00:00.087) 0:07:04.069 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:22:26 -0500 (0:00:00.058) 0:07:04.128 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:22:26 -0500 (0:00:00.053) 0:07:04.181 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:22:27 -0500 (0:00:00.914) 0:07:05.095 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", 
"source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service": { "name": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": 
"systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:22:28 -0500 (0:00:01.223) 0:07:06.319 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:22:28 -0500 (0:00:00.179) 0:07:06.498 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d050796e0\x2da1f4\x2d49fd\x2d8f0b\x2d888f6c1b105e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "name": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket systemd-readahead-collect.service cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice dev-mapper-foo\\x2dtest1.device systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", 
"SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:22:29 -0500 (0:00:00.685) 0:07:07.184 **** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", 
"vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:23:34 -0500 (0:01:05.214) 0:08:12.398 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:23:35 -0500 (0:00:00.122) 0:08:12.520 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948464.793652, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "e54ab630b87ec152c14c386d9417b4af51d33d34", "ctime": 1733948464.790652, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1733948464.790652, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:23:35 -0500 (0:00:00.671) 0:08:13.191 **** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:23:36 -0500 (0:00:00.638) 0:08:13.830 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d050796e0\x2da1f4\x2d49fd\x2d8f0b\x2d888f6c1b105e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "name": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", 
"ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:23:37 -0500 (0:00:00.817) 0:08:14.647 **** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": 
"/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:23:37 -0500 (0:00:00.128) 0:08:14.775 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, 
"compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:23:37 -0500 (0:00:00.102) 0:08:14.878 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:23:37 -0500 (0:00:00.065) 0:08:14.944 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 December 2024 15:23:38 -0500 (0:00:00.716) 0:08:15.661 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:23:38 -0500 (0:00:00.823) 0:08:16.484 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 
December 2024 15:23:39 -0500 (0:00:00.437) 0:08:16.921 **** skipping: [managed-node3] => (item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:23:39 -0500 (0:00:00.062) 0:08:16.983 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:23:39 -0500 (0:00:00.510) 0:08:17.494 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948468.621656, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "42d2e5a11043d088476291fbe56403001d2d7a56", "ctime": 1733948466.5386539, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263819, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1733948466.537654, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744071595676072", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:23:40 -0500 (0:00:00.386) 0:08:17.881 **** changed: [managed-node3] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Wednesday 11 December 2024 15:23:40 -0500 (0:00:00.390) 0:08:18.271 **** ok: [managed-node3] TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:443 Wednesday 11 December 2024 15:23:41 -0500 (0:00:00.827) 0:08:19.099 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3
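For reference, the "Remove obsolete mounts", "Set up new/current mounts", and "Manage /etc/crypttab" steps above are generated by the role from blivet_output's "mounts" and "crypts" lists (the crypttab edit reports like a line-removal task, per "1 line(s) removed"). A hand-written equivalent could look roughly like this -- a minimal sketch only, assuming the stock mount and crypttab modules available in Ansible 2.9, with device names copied from the log above:

    # Sketch only -- not the role's actual tasks; names taken from the log.
    - name: Drop the obsolete fstab entry for the retired LUKS mapping
      mount:
        path: /opt/test1
        src: /dev/mapper/luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e
        fstype: xfs
        state: absent

    - name: Mount the re-formatted plain LV in its place
      mount:
        path: /opt/test1
        src: /dev/mapper/foo-test1
        fstype: xfs
        opts: defaults
        state: mounted

    - name: Remove the matching /etc/crypttab entry
      crypttab:
        name: luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e
        backing_device: /dev/mapper/foo-test1
        password: '-'
        state: absent

The role then re-gathers facts ("Update facts" above) so that the verification steps below see the new layout.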
TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 11 December 2024 15:23:41 -0500 (0:00:00.120) 0:08:19.220 **** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 11 December 2024 15:23:41 -0500 (0:00:00.074) 0:08:19.294 **** skipping: [managed-node3] => {}
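The pool list printed above is the role's own record of the requested layout. Reconstructed from those logged variables, the invocation under test looks roughly like the following -- a sketch showing only the non-default keys, with managed-node3 being the test host from this log:

    # Sketch reconstructed from _storage_pools_list above; defaults omitted.
    - hosts: managed-node3
      roles:
        - fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: false  # this pass removes the LUKS layer created earlier

Because encryption flipped to false while the volume already carried LUKS, blivet scheduled the destroy-format, destroy-device, and create-format actions shown in blivet_output above.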
TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 11 December 2024 15:23:41 -0500 (0:00:00.055) 0:08:19.350 **** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "15e982d7-f015-4d86-9c20-7aaed862bd8c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "eRef9R-riJV-6sGk-Efdx-8G1Q-Wfd0-uEtRuG" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 11 December 2024 15:23:42 -0500 (0:00:00.422) 0:08:19.773 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002998", "end": "2024-12-11 15:23:42.569463", "rc": 0, "start": "2024-12-11 15:23:42.566465" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs
ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 11 December 2024 15:23:42 -0500 (0:00:00.396) 0:08:20.169 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002862", "end": "2024-12-11 15:23:42.977710", "failed_when_result": false, "rc": 0, "start": "2024-12-11 15:23:42.974848" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:23:43 -0500 (0:00:00.417) 0:08:20.587 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 11 December 2024 15:23:43 -0500 (0:00:00.128) 0:08:20.716 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 11 December 2024 15:23:43 -0500 (0:00:00.058) 0:08:20.774 **** ok: [managed-node3] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.018348", "end": "2024-12-11 15:23:43.727087", "rc": 0, "start": "2024-12-11 15:23:43.708739" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 11 December 2024 15:23:43 -0500 (0:00:00.629) 0:08:21.404 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 11 December 2024 15:23:43 -0500 (0:00:00.086) 0:08:21.491 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 11 December 2024 15:23:44 -0500 (0:00:00.178) 0:08:21.670 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 11 December 2024 15:23:44 -0500 (0:00:00.085) 0:08:21.755 **** ok: [managed-node3] => (item=/dev/sda) => { 
"ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 11 December 2024 15:23:44 -0500 (0:00:00.636) 0:08:22.391 **** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 11 December 2024 15:23:44 -0500 (0:00:00.063) 0:08:22.454 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 11 December 2024 15:23:45 -0500 (0:00:00.087) 0:08:22.541 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 11 December 2024 15:23:45 -0500 (0:00:00.080) 0:08:22.622 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 11 December 2024 15:23:45 -0500 (0:00:00.080) 0:08:22.702 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 11 December 2024 15:23:45 -0500 (0:00:00.133) 0:08:22.835 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Wednesday 11 December 2024 15:23:45 -0500 (0:00:00.084) 0:08:22.920 **** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Wednesday 11 December 2024 15:23:45 -0500 (0:00:00.097) 0:08:23.017 **** ok: [managed-node3] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.43.66 closed. 
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Wednesday 11 December 2024 15:23:45 -0500 (0:00:00.294) 0:08:23.312 **** skipping: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Wednesday 11 December 2024 15:23:45 -0500 (0:00:00.050) 0:08:23.363 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 11 December 2024 15:23:45 -0500 (0:00:00.113) 0:08:23.476 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.078) 0:08:23.555 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.056) 0:08:23.612 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.072) 0:08:23.684 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.066) 0:08:23.750 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.056) 0:08:23.806 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.055) 0:08:23.862 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.070) 0:08:23.933 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.066) 0:08:23.999 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.065) 0:08:24.065 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.063) 0:08:24.128 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.070) 0:08:24.198 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.143) 0:08:24.342 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node3 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 11 December 2024 15:23:46 -0500 (0:00:00.137) 0:08:24.479 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.060) 0:08:24.540 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.055) 0:08:24.596 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** 
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.059) 0:08:24.656 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.054) 0:08:24.710 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.042) 0:08:24.753 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.046) 0:08:24.799 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.072) 0:08:24.872 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.164) 0:08:25.037 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node3 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.125) 0:08:25.163 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.075) 0:08:25.238 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.057) 0:08:25.295 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.061) 0:08:25.356 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Wednesday 11 December 2024 15:23:47 -0500 (0:00:00.063) 0:08:25.420 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.124) 0:08:25.544 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.061) 0:08:25.606 **** skipping: [managed-node3] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.073) 0:08:25.680 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node3 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.195) 0:08:25.875 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.079) 0:08:25.954 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.067) 0:08:26.021 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.057) 0:08:26.079 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.150) 0:08:26.229 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.047) 0:08:26.277 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.048) 0:08:26.326 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.041) 0:08:26.367 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 11 December 2024 15:23:48 -0500 (0:00:00.101) 0:08:26.469 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node3 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.112) 0:08:26.582 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.048) 0:08:26.630 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.047) 0:08:26.677 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.053) 0:08:26.731 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
deduplication is off] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.055) 0:08:26.786 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.038) 0:08:26.825 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.041) 0:08:26.867 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.049) 0:08:26.917 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.131) 0:08:27.048 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.046) 0:08:27.094 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.043) 0:08:27.138 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.042) 0:08:27.181 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.059) 0:08:27.241 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.058) 0:08:27.299 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.096) 0:08:27.395 **** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 11 December 2024 15:23:49 -0500 (0:00:00.110) 0:08:27.506 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:23:50 -0500 (0:00:00.127) 0:08:27.633 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:23:50 -0500 (0:00:00.098) 0:08:27.732 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 11 December 2024 15:23:50 -0500 (0:00:00.339) 0:08:28.072 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 11 December 2024 15:23:50 -0500 (0:00:00.076) 0:08:28.148 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 11 December 2024 15:23:50 -0500 (0:00:00.074) 0:08:28.223 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Wednesday 11 December 2024 15:23:50 -0500 (0:00:00.056) 0:08:28.280 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Wednesday 11 December 2024 15:23:50 -0500 (0:00:00.075) 0:08:28.355 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Wednesday 11 December 2024 15:23:50 -0500 (0:00:00.070) 0:08:28.425 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Wednesday 11 December 2024 15:23:50 -0500 (0:00:00.058) 0:08:28.483 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Wednesday 11 December 2024 15:23:51 -0500 (0:00:00.124) 0:08:28.608 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Wednesday 11 December 2024 15:23:51 -0500 (0:00:00.099) 0:08:28.708 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Wednesday 11 December 2024 15:23:51 -0500 (0:00:00.057) 0:08:28.765 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Wednesday 11 December 
2024 15:23:51 -0500 (0:00:00.083) 0:08:28.848 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 11 December 2024 15:23:51 -0500 (0:00:00.132) 0:08:28.981 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 11 December 2024 15:23:51 -0500 (0:00:00.191) 0:08:29.173 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 11 December 2024 15:23:51 -0500 (0:00:00.076) 0:08:29.249 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 11 December 2024 15:23:51 -0500 (0:00:00.083) 0:08:29.332 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 11 December 2024 15:23:51 -0500 (0:00:00.053) 0:08:29.386 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Wednesday 11 December 2024 15:23:51 -0500 (0:00:00.084) 0:08:29.470 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 11 December 2024 15:23:52 -0500 (0:00:00.061) 0:08:29.531 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 11 December 2024 15:23:52 -0500 (0:00:00.069) 0:08:29.601 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 11 December 2024 15:23:52 -0500 (0:00:00.073) 0:08:29.674 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948614.703806, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948614.703806, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 465826, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1733948614.703806, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 11 December 2024 15:23:52 -0500 (0:00:00.465) 0:08:30.139 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 11 December 2024 15:23:52 -0500 (0:00:00.076) 0:08:30.216 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 11 December 2024 15:23:52 -0500 (0:00:00.059) 0:08:30.275 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 11 December 2024 15:23:52 -0500 (0:00:00.071) 0:08:30.346 **** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 11 December 2024 15:23:52 -0500 (0:00:00.067) 0:08:30.414 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 11 December 2024 15:23:52 -0500 (0:00:00.058) 0:08:30.472 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] 
************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 11 December 2024 15:23:53 -0500 (0:00:00.076) 0:08:30.548 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 11 December 2024 15:23:53 -0500 (0:00:00.137) 0:08:30.686 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 11 December 2024 15:23:53 -0500 (0:00:00.710) 0:08:31.397 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 11 December 2024 15:23:53 -0500 (0:00:00.068) 0:08:31.465 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.071) 0:08:31.537 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.074) 0:08:31.611 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.060) 0:08:31.672 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.057) 0:08:31.730 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.058) 0:08:31.789 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.057) 0:08:31.847 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.055) 0:08:31.902 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.088) 0:08:31.990 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.077) 0:08:32.068 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.109) 0:08:32.178 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.096) 0:08:32.274 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.078) 0:08:32.352 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.065) 0:08:32.418 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 11 December 2024 15:23:54 -0500 (0:00:00.074) 0:08:32.493 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] 
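
The crypttab checks above amount to counting matching lines in /etc/crypttab; with the volume currently unencrypted, zero entries are expected. A rough equivalent (the '^luks-' pattern is an assumption about the entry name, not taken from the test):

- name: Count crypttab entries for the volume (sketch)
  ansible.builtin.command: grep -c '^luks-' /etc/crypttab
  register: crypttab_count
  changed_when: false
  failed_when: crypttab_count.rc not in [0, 1]   # grep exits 1 when nothing matches

- name: Assert the expected number of entries (0 while unencrypted)
  ansible.builtin.assert:
    that:
      - (crypttab_count.stdout | int) == 0
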
************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 11 December 2024 15:23:55 -0500 (0:00:00.070) 0:08:32.563 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 11 December 2024 15:23:55 -0500 (0:00:00.059) 0:08:32.622 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 11 December 2024 15:23:55 -0500 (0:00:00.060) 0:08:32.683 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 11 December 2024 15:23:55 -0500 (0:00:00.064) 0:08:32.747 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 11 December 2024 15:23:55 -0500 (0:00:00.059) 0:08:32.807 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 11 December 2024 15:23:55 -0500 (0:00:00.105) 0:08:32.912 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 11 December 2024 15:23:55 -0500 (0:00:00.091) 0:08:33.004 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 11 December 2024 15:23:55 -0500 (0:00:00.095) 0:08:33.100 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 11 December 2024 15:23:55 -0500 (0:00:00.099) 0:08:33.200 **** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 11 
December 2024 15:23:56 -0500 (0:00:00.621) 0:08:33.821 **** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 11 December 2024 15:23:56 -0500 (0:00:00.534) 0:08:34.356 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 11 December 2024 15:23:56 -0500 (0:00:00.086) 0:08:34.443 **** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 11 December 2024 15:23:56 -0500 (0:00:00.063) 0:08:34.507 **** ok: [managed-node3] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 11 December 2024 15:23:57 -0500 (0:00:00.491) 0:08:34.998 **** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 11 December 2024 15:23:57 -0500 (0:00:00.068) 0:08:35.067 **** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 11 December 2024 15:23:57 -0500 (0:00:00.087) 0:08:35.155 **** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 11 December 2024 15:23:57 -0500 (0:00:00.076) 0:08:35.231 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Wednesday 11 December 2024 15:23:57 -0500 (0:00:00.053) 0:08:35.285 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Wednesday 11 December 2024 15:23:57 -0500 (0:00:00.049) 0:08:35.335 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 
Wednesday 11 December 2024 15:23:57 -0500 (0:00:00.046) 0:08:35.382 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Wednesday 11 December 2024 15:23:57 -0500 (0:00:00.042) 0:08:35.424 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Wednesday 11 December 2024 15:23:57 -0500 (0:00:00.058) 0:08:35.482 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Wednesday 11 December 2024 15:23:58 -0500 (0:00:00.059) 0:08:35.542 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Wednesday 11 December 2024 15:23:58 -0500 (0:00:00.058) 0:08:35.601 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Wednesday 11 December 2024 15:23:58 -0500 (0:00:00.057) 0:08:35.659 **** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Wednesday 11 December 2024 15:23:58 -0500 (0:00:00.069) 0:08:35.728 **** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Wednesday 11 December 2024 15:23:58 -0500 (0:00:00.085) 0:08:35.814 **** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Wednesday 11 December 2024 15:23:58 -0500 (0:00:00.088) 0:08:35.903 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Wednesday 11 December 2024 15:23:58 -0500 (0:00:00.113) 0:08:36.016 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 
Wednesday 11 December 2024 15:23:58 -0500 (0:00:00.121) 0:08:36.138 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Wednesday 11 December 2024 15:23:58 -0500 (0:00:00.142) 0:08:36.280 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Wednesday 11 December 2024 15:23:58 -0500 (0:00:00.088) 0:08:36.369 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Wednesday 11 December 2024 15:23:58 -0500 (0:00:00.053) 0:08:36.423 **** ok: [managed-node3] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Wednesday 11 December 2024 15:23:59 -0500 (0:00:00.121) 0:08:36.544 **** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Wednesday 11 December 2024 15:23:59 -0500 (0:00:00.076) 0:08:36.621 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 11 December 2024 15:23:59 -0500 (0:00:00.088) 0:08:36.709 **** ok: [managed-node3] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.021383", "end": "2024-12-11 15:23:59.650017", "rc": 0, "start": "2024-12-11 15:23:59.628634" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 11 December 2024 15:23:59 -0500 (0:00:00.545) 0:08:37.255 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 11 December 2024 15:23:59 -0500 (0:00:00.073) 0:08:37.329 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] 
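
The lvs invocation above is worth noting: --nameprefixes plus --unquoted turn the output into machine-parseable KEY=VALUE pairs (LVM2_LV_NAME=test1, LVM2_SEGTYPE=linear, ...), which is what lets the next task lift the segment type into a fact. Wrapped as standalone tasks (a sketch; the fact extraction is an illustration, not the role's exact expression):

- name: Query LV attributes as KEY=VALUE pairs (command taken from the log above)
  ansible.builtin.command: >
    lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
    -o name,attr,cache_total_blocks,chunk_size,segtype foo/test1
  register: lvs_out
  changed_when: false

- name: Extract the segment type from the KEY=VALUE output
  ansible.builtin.set_fact:
    lv_segtype: "{{ (lvs_out.stdout | regex_search('LVM2_SEGTYPE=\\S+')).split('=')[1] }}"
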
******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 11 December 2024 15:23:59 -0500 (0:00:00.074) 0:08:37.403 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 11 December 2024 15:23:59 -0500 (0:00:00.062) 0:08:37.466 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 11 December 2024 15:24:00 -0500 (0:00:00.081) 0:08:37.548 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 11 December 2024 15:24:00 -0500 (0:00:00.093) 0:08:37.642 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 11 December 2024 15:24:00 -0500 (0:00:00.108) 0:08:37.750 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:24:00 -0500 (0:00:00.087) 0:08:37.838 **** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Wednesday 11 December 2024 15:24:00 -0500 (0:00:00.057) 0:08:37.896 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Wednesday 11 December 2024 15:24:00 -0500 (0:00:00.069) 0:08:37.966 **** changed: [managed-node3] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:449 Wednesday 11 December 2024 15:24:00 -0500 (0:00:00.506) 0:08:38.472 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node3 TASK [Store global variable value copy] **************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Wednesday 11 December 2024 15:24:01 -0500 (0:00:00.124) 0:08:38.596 **** ok: [managed-node3] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Wednesday 11 December 2024 15:24:01 -0500 (0:00:00.058) 0:08:38.655 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:24:01 -0500 (0:00:00.186) 0:08:38.842 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:24:01 -0500 (0:00:00.093) 0:08:38.936 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:24:01 -0500 (0:00:00.068) 0:08:39.004 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:24:01 -0500 (0:00:00.139) 0:08:39.143 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:24:01 -0500 (0:00:00.063) 0:08:39.207 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an 
empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:24:01 -0500 (0:00:00.053) 0:08:39.261 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:24:01 -0500 (0:00:00.058) 0:08:39.319 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:24:01 -0500 (0:00:00.065) 0:08:39.385 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:24:02 -0500 (0:00:00.135) 0:08:39.520 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:24:03 -0500 (0:00:01.280) 0:08:40.801 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:24:03 -0500 (0:00:00.050) 0:08:40.851 **** ok: [managed-node3] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:24:03 -0500 (0:00:00.046) 0:08:40.898 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK 
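
The storage_pools value printed above is the input that drives this part of the run: it asks the role to add LUKS encryption to the existing test1 volume. A minimal playbook carrying the same spec (values taken from the output above) would look roughly like:

- hosts: managed-node3
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo
            type: lvm
            disks: [sda]
            volumes:
              - name: test1
                size: 4g
                mount_point: /opt/test1
                encryption: true
                encryption_password: yabbadabbadoo
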
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:24:07 -0500 (0:00:04.067) 0:08:44.965 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:24:07 -0500 (0:00:00.099) 0:08:45.065 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:24:07 -0500 (0:00:00.054) 0:08:45.120 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:24:07 -0500 (0:00:00.056) 0:08:45.176 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:24:07 -0500 (0:00:00.056) 0:08:45.233 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:24:08 -0500 (0:00:00.754) 0:08:45.987 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": 
{ "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { 
"name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service": { "name": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:24:09 -0500 (0:00:01.027) 0:08:47.015 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:24:09 -0500 (0:00:00.058) 0:08:47.074 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d050796e0\x2da1f4\x2d49fd\x2d8f0b\x2d888f6c1b105e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "name": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "cryptsetup-pre.target dev-mapper-foo\\x2dtest1.device systemd-readahead-collect.service systemd-journald.socket system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", 
"CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-050796e0-a1f4-49fd-8f0b-888f6c1b105e ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", 
"SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:24:10 -0500 (0:00:00.532) 0:08:47.607 **** fatal: [managed-node3]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Wednesday 11 December 2024 15:24:14 -0500 (0:00:04.094) 0:08:51.701 **** fatal: [managed-node3]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], 
u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'test1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:24:14 -0500 (0:00:00.079) 0:08:51.780 **** changed: [managed-node3] => (item=systemd-cryptsetup@luks\x2d050796e0\x2da1f4\x2d49fd\x2d8f0b\x2d888f6c1b105e.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "name": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": 
"18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d050796e0\\x2da1f4\\x2d49fd\\x2d8f0b\\x2d888f6c1b105e.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Wednesday 11 December 2024 15:24:14 -0500 (0:00:00.537) 0:08:52.318 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Wednesday 11 December 2024 15:24:14 -0500 (0:00:00.042) 0:08:52.361 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Wednesday 11 December 2024 15:24:14 -0500 (0:00:00.059) 0:08:52.421 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Wednesday 11 December 2024 15:24:14 -0500 (0:00:00.042) 0:08:52.463 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948640.8758264, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1733948640.8758264, "dev": 64768, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1733948640.8758264, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744073292117187", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Wednesday 11 December 2024 15:24:15 -0500 (0:00:00.429) 0:08:52.893 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:472 Wednesday 11 December 2024 15:24:15 -0500 (0:00:00.056) 0:08:52.950 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:24:15 -0500 (0:00:00.213) 0:08:53.164 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:24:15 -0500 (0:00:00.061) 0:08:53.225 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:24:15 -0500 (0:00:00.048) 0:08:53.274 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:24:15 -0500 (0:00:00.098) 0:08:53.373 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:24:15 -0500 (0:00:00.039) 0:08:53.413 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:24:15 -0500 (0:00:00.038) 0:08:53.452 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:24:15 -0500 (0:00:00.041) 0:08:53.493 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:24:16 -0500 (0:00:00.039) 0:08:53.533 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:24:16 -0500 (0:00:00.096) 0:08:53.630 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:24:17 -0500 (0:00:01.193) 0:08:54.823 **** ok: [managed-node3] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Wednesday 11 December 2024 15:24:17 -0500 (0:00:00.097) 0:08:54.920 **** ok: [managed-node3] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:24:17 -0500 (0:00:00.088) 0:08:55.009 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:24:21 -0500 (0:00:03.900) 0:08:58.910 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:24:21 -0500 (0:00:00.095) 0:08:59.005 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:24:21 -0500 (0:00:00.059) 0:08:59.064 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:24:21 -0500 (0:00:00.069) 0:08:59.134 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:24:21 -0500 (0:00:00.058) 0:08:59.193 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:24:22 -0500 (0:00:00.840) 0:09:00.033 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
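Two details worth noting in the re-run above. The "VARIABLE IS NOT DEFINED!" for storage_volumes is benign: this test drives everything through storage_pools, and the role prints both lists for debugging. And "Get required packages" resolves cryptsetup and lvm2 because the spec is LUKS on LVM, while the base blivet package set comes from vars/CentOS_7.yml, which picks the right libblockdev build per architecture with an inline Jinja conditional, as echoed in the vars output above:

blivet_package_list:
  - python-enum34
  - python-blivet3
  - libblockdev-crypto
  - libblockdev-dm
  - libblockdev-lvm
  - libblockdev-mdraid
  - libblockdev-swap
  - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"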
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": 
"systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": 
"plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": 
"rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { 
"name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:24:23 -0500 (0:00:01.162) 0:09:01.196 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:24:23 -0500 (0:00:00.083) 0:09:01.279 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:24:23 -0500 (0:00:00.067) 0:09:01.347 **** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "password": 
"-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:24:34 -0500 (0:00:10.941) 0:09:12.288 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:24:34 -0500 (0:00:00.100) 0:09:12.389 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948619.3088105, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3fceedeef6c619b69ada96279531b69ed89734ba", "ctime": 1733948619.3058105, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", 
"mtime": 1733948619.3058105, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1279, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:24:35 -0500 (0:00:00.485) 0:09:12.874 **** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:24:36 -0500 (0:00:00.818) 0:09:13.692 **** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:24:36 -0500 (0:00:00.085) 0:09:13.778 **** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:24:36 -0500 (0:00:00.116) 0:09:13.894 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:24:36 -0500 (0:00:00.124) 0:09:14.018 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:24:36 -0500 (0:00:00.108) 0:09:14.127 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", 
"fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 December 2024 15:24:37 -0500 (0:00:00.628) 0:09:14.755 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:24:37 -0500 (0:00:00.668) 0:09:15.424 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 December 2024 15:24:38 -0500 (0:00:00.692) 0:09:16.116 **** skipping: [managed-node3] => (item={u'src': u'/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:24:38 -0500 (0:00:00.076) 0:09:16.193 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:24:39 -0500 (0:00:00.640) 0:09:16.834 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948622.9768143, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1733948620.6518118, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263817, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, 
"isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1733948620.650812, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744071595676437", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:24:39 -0500 (0:00:00.626) 0:09:17.460 **** changed: [managed-node3] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Wednesday 11 December 2024 15:24:40 -0500 (0:00:00.479) 0:09:17.940 **** ok: [managed-node3] TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:488 Wednesday 11 December 2024 15:24:42 -0500 (0:00:02.156) 0:09:20.096 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 11 December 2024 15:24:42 -0500 (0:00:00.227) 0:09:20.324 **** ok: [managed-node3] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": 
null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 11 December 2024 15:24:42 -0500 (0:00:00.079) 0:09:20.403 **** skipping: [managed-node3] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 11 December 2024 15:24:42 -0500 (0:00:00.085) 0:09:20.489 **** ok: [managed-node3] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "517d3f33-747e-42be-acc3-d4bf9fd40cdf" }, "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "size": "4G", "type": "crypt", "uuid": "b9090a05-0b9b-4d88-b41b-79f783f91896" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "eRef9R-riJV-6sGk-Efdx-8G1Q-Wfd0-uEtRuG" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 11 December 2024 15:24:43 -0500 (0:00:00.536) 0:09:21.026 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.003495", "end": "2024-12-11 15:24:43.866683", "rc": 0, "start": "2024-12-11 15:24:43.863188" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # 
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 11 December 2024 15:24:44 -0500 (0:00:00.511) 0:09:21.538 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.003525", "end": "2024-12-11 15:24:44.428652", "failed_when_result": false, "rc": 0, "start": "2024-12-11 15:24:44.425127" } STDOUT: luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:24:44 -0500 (0:00:00.584) 0:09:22.122 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node3 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Wednesday 11 December 2024 15:24:44 -0500 (0:00:00.264) 0:09:22.387 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Wednesday 11 December 2024 15:24:44 -0500 (0:00:00.092) 0:09:22.480 **** ok: [managed-node3] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.020066", "end": "2024-12-11 15:24:45.424096", "rc": 0, "start": "2024-12-11 15:24:45.404030" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Wednesday 11 December 2024 15:24:45 -0500 (0:00:00.602) 0:09:23.082 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Wednesday 11 December 2024 15:24:45 -0500 (0:00:00.116) 0:09:23.199 **** included: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Wednesday 11 December 2024 15:24:45 -0500 (0:00:00.205) 0:09:23.405 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Wednesday 11 December 2024 15:24:45 -0500 (0:00:00.084) 0:09:23.490 **** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Wednesday 11 December 2024 15:24:46 -0500 (0:00:00.464) 0:09:23.954 **** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Wednesday 11 December 2024 15:24:46 -0500 (0:00:00.060) 0:09:24.014 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Wednesday 11 December 2024 15:24:46 -0500 (0:00:00.053) 0:09:24.067 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Wednesday 11 December 2024 15:24:46 -0500 (0:00:00.049) 0:09:24.116 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Wednesday 11 December 2024 15:24:46 -0500 (0:00:00.044) 0:09:24.161 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Wednesday 11 December 2024 15:24:46 -0500 (0:00:00.050) 0:09:24.211 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 
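The "vgs --noheadings --binary -o shared foo" command earlier in this run returned 0, confirming the volume group is not marked shared, and the tasks above established that /dev/sda is the pool's only physical volume and set the expected member type to "disk". A minimal standalone reproduction of these membership checks could look like the following tasks; this is an illustrative sketch under those assumptions, not the test suite's actual implementation:

- name: Query the shared flag of VG foo
  command: vgs --noheadings --binary -o shared foo
  register: vg_shared
  changed_when: false

- name: List the PVs that belong to VG foo
  command: vgs --noheadings -o pv_name foo
  register: vg_pvs
  changed_when: false

- name: Assert the VG is not shared and /dev/sda is its only member
  assert:
    that:
      - vg_shared.stdout | trim == "0"
      - vg_pvs.stdout_lines | map("trim") | list == ["/dev/sda"]
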
Wednesday 11 December 2024 15:24:46 -0500 (0:00:00.054) 0:09:24.266 **** ok: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Wednesday 11 December 2024 15:24:46 -0500 (0:00:00.089) 0:09:24.355 **** ok: [managed-node3] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.43.66 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.314) 0:09:24.669 **** skipping: [managed-node3] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.071) 0:09:24.741 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node3 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.119) 0:09:24.860 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.079) 0:09:24.940 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.057) 0:09:24.998 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.059) 0:09:25.058 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.063) 0:09:25.122 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.065) 0:09:25.187 **** 
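The MD RAID subtests above all report "Conditional result was False": this pool is plain LVM on a whole disk, so every mdadm-related verification is guarded by a when: condition that evaluates false and the task is skipped rather than run. Schematically (the variable name below is illustrative, not the test's actual one):

- name: Get information about RAID
  command: mdadm --detail /dev/md/example
  register: storage_test_mdadm
  changed_when: false
  when: storage_test_pool_raid_level is not none  # false for this pool, so Ansible skips the task
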
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.064) 0:09:25.252 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.057) 0:09:25.309 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.056) 0:09:25.365 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.059) 0:09:25.425 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Wednesday 11 December 2024 15:24:47 -0500 (0:00:00.059) 0:09:25.485 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.063) 0:09:25.549 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node3 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.217) 0:09:25.766 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node3 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.120) 0:09:25.887 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.060) 0:09:25.948 **** 
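The LVM RAID checks that follow are skipped for the same reason. When they do run, the LV's segment type is what distinguishes a linear volume from an LVM RAID one, and it can be read with lvs. A hedged sketch of such a check, not the test's exact task:

- name: Read the segment type of LV foo/test1
  command: lvs --noheadings -o segtype foo/test1
  register: lv_segtype
  changed_when: false

- name: A non-RAID LV should report a linear segment type
  assert:
    that:
      - lv_segtype.stdout | trim == "linear"
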
skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.076) 0:09:26.025 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.061) 0:09:26.086 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.061) 0:09:26.148 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.057) 0:09:26.206 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.047) 0:09:26.253 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.051) 0:09:26.304 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node3 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.098) 0:09:26.403 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node3 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Wednesday 11 December 2024 15:24:48 -0500 (0:00:00.080) 0:09:26.484 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.039) 0:09:26.523 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.038) 0:09:26.562 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.038) 0:09:26.601 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.039) 0:09:26.640 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node3 TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.109) 0:09:26.750 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.046) 0:09:26.796 **** skipping: [managed-node3] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.045) 0:09:26.842 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node3 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.072) 0:09:26.915 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.045) 0:09:26.960 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.046) 
0:09:27.007 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.049) 0:09:27.057 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.036) 0:09:27.094 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.038) 0:09:27.133 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.037) 0:09:27.170 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.042) 0:09:27.213 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node3 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.086) 0:09:27.300 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node3 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.084) 0:09:27.385 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.038) 0:09:27.423 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.039) 0:09:27.463 **** skipping: 
[managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Wednesday 11 December 2024 15:24:49 -0500 (0:00:00.038) 0:09:27.502 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.039) 0:09:27.541 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.035) 0:09:27.577 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.041) 0:09:27.619 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.039) 0:09:27.658 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node3 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.094) 0:09:27.752 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.041) 0:09:27.794 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.038) 0:09:27.832 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.036) 0:09:27.869 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* 
task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.038) 0:09:27.907 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.038) 0:09:27.946 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.037) 0:09:27.983 **** ok: [managed-node3] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.042) 0:09:28.026 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.119) 0:09:28.145 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.046) 0:09:28.191 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 11 December 2024 15:24:50 -0500 (0:00:00.321) 0:09:28.513 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.052) 0:09:28.566 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.053) 0:09:28.619 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.038) 0:09:28.658 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.048) 0:09:28.707 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.047) 0:09:28.755 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.064) 0:09:28.819 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.051) 0:09:28.870 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.047) 0:09:28.918 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.045) 0:09:28.963 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.048) 0:09:29.012 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.039) 0:09:29.051 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.079) 0:09:29.131 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.063) 0:09:29.195 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.069) 0:09:29.265 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.057) 0:09:29.322 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.064) 0:09:29.387 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, 
"storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 11 December 2024 15:24:51 -0500 (0:00:00.064) 0:09:29.451 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 11 December 2024 15:24:52 -0500 (0:00:00.079) 0:09:29.530 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 11 December 2024 15:24:52 -0500 (0:00:00.069) 0:09:29.600 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948674.4998312, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948674.4998312, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 465826, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1733948674.4998312, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 11 December 2024 15:24:52 -0500 (0:00:00.370) 0:09:29.971 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 11 December 2024 15:24:52 -0500 (0:00:00.060) 0:09:30.031 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 11 December 2024 15:24:52 -0500 (0:00:00.046) 0:09:30.077 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 11 December 2024 15:24:52 -0500 (0:00:00.047) 0:09:30.125 **** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 11 December 2024 15:24:52 -0500 (0:00:00.043) 
0:09:30.169 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 11 December 2024 15:24:52 -0500 (0:00:00.047) 0:09:30.217 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 11 December 2024 15:24:52 -0500 (0:00:00.047) 0:09:30.264 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948674.6258311, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948674.6258311, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 477050, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1733948674.6258311, "nlink": 1, "path": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 11 December 2024 15:24:53 -0500 (0:00:00.380) 0:09:30.645 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 11 December 2024 15:24:53 -0500 (0:00:00.672) 0:09:31.317 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.026300", "end": "2024-12-11 15:24:54.173277", "rc": 0, "start": "2024-12-11 15:24:54.146977" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: ab 6f 28 66 cb 92 34 c6 da 6d 99 5a 3f a8 61 bf 00 0d 3f fb MK salt: c3 59 28 6b d6 1b c6 58 22 45 21 a0 d7 0c 67 c8 12 f2 66 9e 51 49 a1 45 53 1d 81 4a 1b 5f 64 ad MK iterations: 23043 UUID: 517d3f33-747e-42be-acc3-d4bf9fd40cdf Key Slot 0: ENABLED Iterations: 368696 Salt: 1f f4 85 df b3 01 f3 e1 a8 2a 33 94 6c 99 a5 bb 6b 35 19 39 aa 5f 58 f8 c8 e8 b0 9a 15 53 a1 d0 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 11 December 2024 15:24:54 -0500 (0:00:00.470) 0:09:31.788 **** ok: 
[managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 11 December 2024 15:24:54 -0500 (0:00:00.080) 0:09:31.868 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 11 December 2024 15:24:54 -0500 (0:00:00.064) 0:09:31.933 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 11 December 2024 15:24:54 -0500 (0:00:00.056) 0:09:31.990 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 11 December 2024 15:24:54 -0500 (0:00:00.081) 0:09:32.071 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Wednesday 11 December 2024 15:24:54 -0500 (0:00:00.076) 0:09:32.147 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Wednesday 11 December 2024 15:24:54 -0500 (0:00:00.069) 0:09:32.217 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Wednesday 11 December 2024 15:24:54 -0500 (0:00:00.063) 0:09:32.280 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Wednesday 11 December 2024 15:24:54 -0500 (0:00:00.079) 0:09:32.360 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Wednesday 11 December 2024 15:24:54 -0500 (0:00:00.103) 0:09:32.464 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check backing 
device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.086) 0:09:32.550 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.084) 0:09:32.634 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.070) 0:09:32.705 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.044) 0:09:32.750 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.038) 0:09:32.789 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.043) 0:09:32.832 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.063) 0:09:32.895 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.082) 0:09:32.978 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.065) 0:09:33.043 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.056) 0:09:33.099 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.051) 0:09:33.151 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.042) 0:09:33.193 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.044) 0:09:33.238 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 11 December 2024 15:24:55 -0500 (0:00:00.068) 0:09:33.307 **** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 11 December 2024 15:24:56 -0500 (0:00:00.442) 0:09:33.749 **** ok: [managed-node3] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 11 December 2024 15:24:56 -0500 (0:00:00.513) 0:09:34.263 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 11 December 2024 15:24:56 -0500 (0:00:00.075) 0:09:34.338 **** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 11 December 2024 15:24:56 -0500 (0:00:00.074) 0:09:34.412 **** ok: [managed-node3] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 11 December 2024 15:24:57 -0500 
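
The size parsing above is plain base-2 arithmetic: 4294967296 bytes = 4 x 1024^3 = 4 GiB for the volume, and the parent device reports 10737418240 bytes = 10 GiB. The ansible-core filters can do the same conversion both ways; a small sketch (the exact label printed by human_readable is version-dependent):

    - name: Convert between bytes and human-readable sizes
      ansible.builtin.debug:
        msg:
          - "{{ '4 GiB' | human_to_bytes }}"      # 4294967296
          - "{{ 4294967296 | human_readable }}"   # roughly "4.00 GB" (binary multiples)
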
(0:00:00.535) 0:09:34.947 **** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 11 December 2024 15:24:57 -0500 (0:00:00.067) 0:09:35.015 **** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 11 December 2024 15:24:57 -0500 (0:00:00.067) 0:09:35.082 **** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 11 December 2024 15:24:57 -0500 (0:00:00.068) 0:09:35.151 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Wednesday 11 December 2024 15:24:57 -0500 (0:00:00.068) 0:09:35.219 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Wednesday 11 December 2024 15:24:57 -0500 (0:00:00.074) 0:09:35.294 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Wednesday 11 December 2024 15:24:57 -0500 (0:00:00.082) 0:09:35.376 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Wednesday 11 December 2024 15:24:57 -0500 (0:00:00.057) 0:09:35.433 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Wednesday 11 December 2024 15:24:57 -0500 (0:00:00.075) 0:09:35.509 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.057) 0:09:35.566 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Wednesday 11 December 2024 15:24:58 -0500 
(0:00:00.059) 0:09:35.625 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.096) 0:09:35.722 **** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.097) 0:09:35.820 **** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.069) 0:09:35.890 **** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.062) 0:09:35.953 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.057) 0:09:36.011 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.053) 0:09:36.064 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.040) 0:09:36.105 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.041) 0:09:36.146 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.052) 0:09:36.199 **** ok: [managed-node3] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: 
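
Every task in the thin-pool branch above is skipped: judging by the task names and skip reasons, that branch only applies to thin volumes or percentage-based sizes, and test1 is a plain linear LV with an absolute size. When the branch does run, the expected size is simple arithmetic on the pool size; a hypothetical sketch for a "60%" volume in the 10 GiB pool from this run:

    - name: Expected size for a percentage-based volume (hypothetical values)
      ansible.builtin.set_fact:
        storage_test_expected_size: "{{ (10737418240 * 60 / 100) | int }}"  # 6442450944 bytes
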
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.071) 0:09:36.270 **** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.063) 0:09:36.334 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 11 December 2024 15:24:58 -0500 (0:00:00.081) 0:09:36.415 **** ok: [managed-node3] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.018824", "end": "2024-12-11 15:24:59.283084", "rc": 0, "start": "2024-12-11 15:24:59.264260" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 11 December 2024 15:24:59 -0500 (0:00:00.490) 0:09:36.906 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 11 December 2024 15:24:59 -0500 (0:00:00.073) 0:09:36.979 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 11 December 2024 15:24:59 -0500 (0:00:00.075) 0:09:37.055 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 11 December 2024 15:24:59 -0500 (0:00:00.059) 0:09:37.114 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 11 December 2024 15:24:59 -0500 (0:00:00.048) 0:09:37.163 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 11 December 2024 15:24:59 -0500 (0:00:00.048) 0:09:37.212 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
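
The cache checks above rely on the --nameprefixes mode of lvs, which emits shell-parseable LVM2_KEY=value tokens instead of a table. A sketch of the same query reduced to the segment type (regex extraction is one option; the role splits the tokens differently):

    - name: Query LV properties as LVM2_KEY=value tokens
      ansible.builtin.command: >
        lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted
        -o name,segtype foo/test1
      register: lvs_out
      changed_when: false

    - name: Extract the segment type ("linear" in this run)
      ansible.builtin.set_fact:
        # regex_search with a group argument returns a list, e.g. ['linear'],
        # matching the list-valued fact set by the task above
        lv_segtype: "{{ lvs_out.stdout | regex_search('LVM2_SEGTYPE=(\\S+)', '\\1') }}"
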
[Clean up facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 11 December 2024 15:24:59 -0500 (0:00:00.054) 0:09:37.266 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:24:59 -0500 (0:00:00.064) 0:09:37.331 **** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Wednesday 11 December 2024 15:24:59 -0500 (0:00:00.053) 0:09:37.384 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:491 Wednesday 11 December 2024 15:24:59 -0500 (0:00:00.064) 0:09:37.449 **** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Wednesday 11 December 2024 15:25:00 -0500 (0:00:00.282) 0:09:37.731 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Wednesday 11 December 2024 15:25:00 -0500 (0:00:00.130) 0:09:37.862 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Wednesday 11 December 2024 15:25:00 -0500 (0:00:00.068) 0:09:37.931 **** skipping: [managed-node3] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node3] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node3] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node3] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system 
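
The loop above is the usual platform/version vars cascade: candidate files run from generic (RedHat.yml) to specific (CentOS_7.9.yml), and on this host only CentOS_7.yml exists, which is where the CentOS 7 blivet package list comes from. The same idea can be sketched with the first_found lookup, most-specific-first (the role itself instead loads every matching file from generic to specific):

    - name: Load the most specific platform vars file that exists
      ansible.builtin.include_vars: "{{ lookup('ansible.builtin.first_found', params) }}"
      vars:
        params:
          files:
            - CentOS_7.9.yml
            - CentOS_7.yml
            - CentOS.yml
            - RedHat.yml
          paths:
            - vars
          skip: true
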
is ostree] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Wednesday 11 December 2024 15:25:00 -0500 (0:00:00.185) 0:09:38.116 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Wednesday 11 December 2024 15:25:00 -0500 (0:00:00.108) 0:09:38.225 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Wednesday 11 December 2024 15:25:00 -0500 (0:00:00.070) 0:09:38.295 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Wednesday 11 December 2024 15:25:00 -0500 (0:00:00.071) 0:09:38.366 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Wednesday 11 December 2024 15:25:00 -0500 (0:00:00.107) 0:09:38.474 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Wednesday 11 December 2024 15:25:01 -0500 (0:00:00.198) 0:09:38.672 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Wednesday 11 December 2024 15:25:02 -0500 (0:00:01.469) 0:09:40.142 **** ok: [managed-node3] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 
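
This "Clean up" play drives the storage role with a single disk volume in state absent, which is what triggers the full LUKS/LVM teardown later in the log; the storage_volumes value is printed just below. Reduced to a standalone sketch:

    - hosts: all
      tasks:
        - name: Remove the test volume and everything stacked on it
          ansible.builtin.include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: foo
                type: disk
                disks: [sda]
                state: absent
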
Wednesday 11 December 2024 15:25:02 -0500 (0:00:00.085) 0:09:40.228 **** ok: [managed-node3] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Wednesday 11 December 2024 15:25:02 -0500 (0:00:00.082) 0:09:40.310 **** ok: [managed-node3] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Wednesday 11 December 2024 15:25:07 -0500 (0:00:04.392) 0:09:44.702 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node3 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Wednesday 11 December 2024 15:25:07 -0500 (0:00:00.110) 0:09:44.812 **** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Wednesday 11 December 2024 15:25:07 -0500 (0:00:00.180) 0:09:44.993 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Wednesday 11 December 2024 15:25:07 -0500 (0:00:00.083) 0:09:45.077 **** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Wednesday 11 December 2024 15:25:07 -0500 (0:00:00.092) 0:09:45.169 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Wednesday 11 December 2024 15:25:08 -0500 (0:00:01.085) 0:09:46.254 **** ok: [managed-node3] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": 
"autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "running", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": 
"systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Wednesday 11 December 2024 15:25:09 -0500 (0:00:01.183) 0:09:47.438 **** ok: [managed-node3] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Wednesday 11 December 2024 15:25:10 -0500 (0:00:00.090) 0:09:47.529 **** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Wednesday 11 December 2024 15:25:10 -0500 (0:00:00.056) 0:09:47.585 **** changed: [managed-node3] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", 
"/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=eRef9R-riJV-6sGk-Efdx-8G1Q-Wfd0-uEtRuG", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Wednesday 11 December 2024 15:25:45 -0500 (0:00:35.046) 0:10:22.632 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Wednesday 11 December 2024 15:25:45 -0500 (0:00:00.073) 0:10:22.706 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948678.4818318, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9afe2e2b58d444cde72f5095caf10ebe860b2ff1", "ctime": 1733948678.4788318, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263645, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1733948678.4788318, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "18446744071595669004", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Wednesday 11 December 2024 15:25:45 -0500 (0:00:00.468) 0:10:23.174 **** ok: [managed-node3] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Wednesday 11 December 2024 15:25:46 -0500 (0:00:00.388) 0:10:23.563 **** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Wednesday 11 December 2024 15:25:46 -0500 (0:00:00.055) 0:10:23.619 **** ok: [managed-node3] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=eRef9R-riJV-6sGk-Efdx-8G1Q-Wfd0-uEtRuG", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Wednesday 11 December 2024 15:25:46 -0500 (0:00:00.079) 0:10:23.698 **** ok: [managed-node3] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Wednesday 11 December 2024 15:25:46 -0500 (0:00:00.085) 0:10:23.784 **** ok: [managed-node3] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=eRef9R-riJV-6sGk-Efdx-8G1Q-Wfd0-uEtRuG", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Wednesday 11 December 2024 15:25:46 -0500 (0:00:00.091) 0:10:23.876 **** changed: [managed-node3] => (item={u'src': u'/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Wednesday 11 December 2024 15:25:47 -0500 (0:00:00.731) 0:10:24.607 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Wednesday 11 December 2024 15:25:47 -0500 (0:00:00.591) 0:10:25.198 **** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Wednesday 11 December 2024 15:25:47 -0500 (0:00:00.113) 0:10:25.312 **** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Wednesday 11 December 2024 15:25:47 -0500 (0:00:00.059) 0:10:25.371 **** ok: [managed-node3] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Wednesday 11 December 2024 15:25:48 -0500 (0:00:00.576) 0:10:25.948 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948684.4268327, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "dabafde1eb233adfdd3a5ace38f92e9f7d1ab4d0", "ctime": 1733948680.3158321, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263819, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1733948680.314832, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": 
false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "18446744071595676631", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Wednesday 11 December 2024 15:25:49 -0500 (0:00:00.613) 0:10:26.561 **** changed: [managed-node3] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Wednesday 11 December 2024 15:25:49 -0500 (0:00:00.445) 0:10:27.007 **** ok: [managed-node3] TASK [Verify role results] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:501 Wednesday 11 December 2024 15:25:50 -0500 (0:00:00.967) 0:10:27.975 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node3 TASK [Print out pool information] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Wednesday 11 December 2024 15:25:50 -0500 (0:00:00.201) 0:10:28.177 **** skipping: [managed-node3] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Wednesday 11 December 2024 15:25:50 -0500 (0:00:00.105) 0:10:28.283 **** ok: [managed-node3] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=eRef9R-riJV-6sGk-Efdx-8G1Q-Wfd0-uEtRuG", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Wednesday 11 December 2024 15:25:50 -0500 (0:00:00.081) 0:10:28.365 **** ok: [managed-node3] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Wednesday 11 December 2024 15:25:51 -0500 (0:00:00.514) 0:10:28.880 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002880", "end": "2024-12-11 15:25:51.834355", "rc": 0, "start": "2024-12-11 15:25:51.831475" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
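
After a removal, the fstab read above should contain no reference to the deleted LUKS mapping; the remaining entries are the pre-existing root and NFS mounts. A minimal sketch of an equivalent standalone check, using only stock Ansible modules (the mapper path is the one removed earlier in this run; task names are illustrative):

    - name: Read the /etc/fstab file
      command: cat /etc/fstab
      register: storage_test_fstab
      changed_when: false

    - name: Assert the removed LUKS mapping no longer appears in fstab
      assert:
        that:
          - "'/dev/mapper/luks-517d3f33-747e-42be-acc3-d4bf9fd40cdf' not in storage_test_fstab.stdout"
        msg: /etc/fstab still references the removed LUKS device
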
TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Wednesday 11 December 2024 15:25:51 -0500 (0:00:00.605) 0:10:29.486 **** ok: [managed-node3] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002781", "end": "2024-12-11 15:25:52.322684", "failed_when_result": false, "rc": 0, "start": "2024-12-11 15:25:52.319903" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Wednesday 11 December 2024 15:25:52 -0500 (0:00:00.470) 0:10:29.957 **** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Wednesday 11 December 2024 15:25:52 -0500 (0:00:00.051) 0:10:30.009 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node3 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Wednesday 11 December 2024 15:25:52 -0500 (0:00:00.163) 0:10:30.173 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Wednesday 11 December 2024 15:25:52 -0500 (0:00:00.070) 0:10:30.243 **** included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node3 included: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node3 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Wednesday 11 December 2024 15:25:53 -0500 (0:00:00.381) 0:10:30.625 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Wednesday 11 December 2024 15:25:53
-0500 (0:00:00.065) 0:10:30.690 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Wednesday 11 December 2024 15:25:53 -0500 (0:00:00.072) 0:10:30.763 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Wednesday 11 December 2024 15:25:53 -0500 (0:00:00.059) 0:10:30.823 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Wednesday 11 December 2024 15:25:53 -0500 (0:00:00.087) 0:10:30.911 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Wednesday 11 December 2024 15:25:53 -0500 (0:00:00.081) 0:10:30.992 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Wednesday 11 December 2024 15:25:53 -0500 (0:00:00.122) 0:10:31.114 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Wednesday 11 December 2024 15:25:53 -0500 (0:00:00.057) 0:10:31.172 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Wednesday 11 December 2024 15:25:53 -0500 (0:00:00.117) 0:10:31.290 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Wednesday 11 December 2024 15:25:53 -0500 (0:00:00.120) 0:10:31.410 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Wednesday 11 December 2024 15:25:54 -0500 (0:00:00.138) 0:10:31.548 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_found_mount_stat": null, 
"storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Wednesday 11 December 2024 15:25:54 -0500 (0:00:00.116) 0:10:31.676 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Wednesday 11 December 2024 15:25:54 -0500 (0:00:00.112) 0:10:31.788 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Wednesday 11 December 2024 15:25:54 -0500 (0:00:00.088) 0:10:31.877 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Wednesday 11 December 2024 15:25:54 -0500 (0:00:00.076) 0:10:31.953 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Wednesday 11 December 2024 15:25:54 -0500 (0:00:00.056) 0:10:32.010 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Wednesday 11 December 2024 15:25:54 -0500 (0:00:00.068) 0:10:32.079 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Wednesday 11 December 2024 15:25:54 -0500 (0:00:00.057) 0:10:32.136 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fs label] ********************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Wednesday 11 December 2024 15:25:54 -0500 (0:00:00.058) 0:10:32.195 **** skipping: 
[managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [See whether the device node is present] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Wednesday 11 December 2024 15:25:54 -0500 (0:00:00.067) 0:10:32.263 **** ok: [managed-node3] => { "changed": false, "stat": { "atime": 1733948744.8788412, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1733948744.8788412, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 28762, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1733948744.8788412, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Wednesday 11 December 2024 15:25:55 -0500 (0:00:00.499) 0:10:32.762 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Wednesday 11 December 2024 15:25:55 -0500 (0:00:00.080) 0:10:32.842 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Wednesday 11 December 2024 15:25:55 -0500 (0:00:00.076) 0:10:32.919 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Wednesday 11 December 2024 15:25:55 -0500 (0:00:00.055) 0:10:32.975 **** ok: [managed-node3] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Wednesday 11 December 2024 15:25:55 -0500 (0:00:00.097) 0:10:33.072 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Wednesday 11 December 2024 15:25:55 -0500 (0:00:00.087) 0:10:33.160 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the LUKS device, if encrypted] ************************************** task path: 
/tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Wednesday 11 December 2024 15:25:55 -0500 (0:00:00.054) 0:10:33.214 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Wednesday 11 December 2024 15:25:55 -0500 (0:00:00.091) 0:10:33.306 **** ok: [managed-node3] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Wednesday 11 December 2024 15:25:56 -0500 (0:00:01.097) 0:10:34.410 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Wednesday 11 December 2024 15:25:56 -0500 (0:00:00.072) 0:10:34.483 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.080) 0:10:34.564 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.084) 0:10:34.649 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.091) 0:10:34.740 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.064) 0:10:34.804 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.122) 0:10:34.926 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" }
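
The LUKS version, key size, and cipher checks in this block are skipped because the surviving volume is unencrypted. When a volume is encrypted, checks of this kind typically parse the LUKS header; a minimal sketch of how that could be done with cryptsetup luksDump (the device path reuses the mapping torn down earlier in this run, and the LUKS2 assertion is an assumption, not what this particular test asserts):

    - name: Collect LUKS header details for the volume
      command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false

    - name: Check the LUKS version recorded in the header
      assert:
        that:
          - "luks_dump.stdout is search('Version:\\s+2')"  # assumes a LUKS2 header
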
TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.103) 0:10:35.030 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.080) 0:10:35.110 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.070) 0:10:35.180 **** ok: [managed-node3] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.072) 0:10:35.252 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.059) 0:10:35.312 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.076) 0:10:35.389 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Wednesday 11 December 2024 15:25:57 -0500 (0:00:00.070) 0:10:35.459 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.076) 0:10:35.536 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.077) 0:10:35.614 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex]
************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.074) 0:10:35.688 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.055) 0:10:35.744 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.056) 0:10:35.801 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.055) 0:10:35.856 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.058) 0:10:35.915 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.070) 0:10:35.986 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.067) 0:10:36.054 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.060) 0:10:36.114 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.057) 0:10:36.172 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Wednesday 11 December 2024 
15:25:58 -0500 (0:00:00.057) 0:10:36.229 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.056) 0:10:36.285 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.059) 0:10:36.344 **** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.065) 0:10:36.410 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Wednesday 11 December 2024 15:25:58 -0500 (0:00:00.054) 0:10:36.464 **** skipping: [managed-node3] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.057) 0:10:36.521 **** skipping: [managed-node3] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.056) 0:10:36.578 **** skipping: [managed-node3] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.059) 0:10:36.638 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.055) 0:10:36.693 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.057) 0:10:36.750 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.146) 
0:10:36.897 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.057) 0:10:36.954 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.058) 0:10:37.012 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.079) 0:10:37.092 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.057) 0:10:37.149 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.055) 0:10:37.205 **** skipping: [managed-node3] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.057) 0:10:37.262 **** skipping: [managed-node3] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.058) 0:10:37.320 **** skipping: [managed-node3] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.054) 0:10:37.375 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.052) 0:10:37.427 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Wednesday 11 December 2024 15:25:59 -0500 (0:00:00.056) 
0:10:37.484 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.059) 0:10:37.543 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.062) 0:10:37.606 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.056) 0:10:37.663 **** ok: [managed-node3] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.065) 0:10:37.728 **** ok: [managed-node3] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.062) 0:10:37.791 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.057) 0:10:37.848 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.058) 0:10:37.907 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.066) 0:10:37.973 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.060) 0:10:38.034 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] 
****************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.061) 0:10:38.096 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.069) 0:10:38.165 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.059) 0:10:38.225 **** skipping: [managed-node3] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.058) 0:10:38.284 **** ok: [managed-node3] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.059) 0:10:38.343 **** ok: [managed-node3] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } META: ran handlers META: ran handlers PLAY RECAP ********************************************************************* managed-node3 : ok=1224 changed=60 unreachable=0 failed=9 skipped=1064 rescued=9 ignored=0 Wednesday 11 December 2024 15:26:00 -0500 (0:00:00.032) 0:10:38.376 **** =============================================================================== fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 65.21s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 35.05s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 11.23s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.94s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.80s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.24s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage 
the pools and volumes to match the specified state -- 10.23s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.17s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.46s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Get required packages --------------- 4.44s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.41s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.40s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Get required packages --------------- 4.39s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.31s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Get required packages --------------- 4.29s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 4.24s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.23s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.21s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 fedora.linux_system_roles.storage : Get required packages --------------- 4.20s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 fedora.linux_system_roles.storage : Get required packages --------------- 4.19s /tmp/collections-5Oy/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
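
One note on the recap: the play ran to completion even though it reports failed=9, and the matching rescued=9 is consistent with those failures having been caught by block/rescue sections, the usual pattern for scenarios that are expected to fail (for example, invoking the role without a required passphrase). A minimal sketch of that pattern, assuming the role's documented variables; the parameters and task names are illustrative, not the exact test source:

- hosts: managed-node3
  tasks:
    - name: Exercise a scenario that must fail (illustrative sketch)
      block:
        - name: Run the role with intentionally invalid parameters
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: foo
                type: disk
                disks:
                  - sda
                mount_point: /opt/test1
                encryption: true
                # encryption_password deliberately omitted so the role fails

        - name: Flag an unexpected success
          fail:
            msg: The role invocation above was expected to fail

      rescue:
        - name: Confirm that the failure was raised by the role
          assert:
            that:
              - ansible_failed_result is defined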