ansible-playbook 2.9.27
  config file = /etc/ansible/ansible.cfg
  configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/lib/python2.7/site-packages/ansible
  executable location = /usr/bin/ansible-playbook
  python version = 2.7.5 (default, Nov 14 2023, 16:14:06) [GCC 4.8.5 20150623 (Red Hat 4.8.5-44)]
Using /etc/ansible/ansible.cfg as config file
[WARNING]: running playbook inside collection fedora.linux_system_roles
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml
statically imported: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml
Skipping callback 'actionable', as we already have a stdout callback.
Skipping callback 'counter_enabled', as we already have a stdout callback.
Skipping callback 'debug', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'dense', as we already have a stdout callback.
Skipping callback 'full_skip', as we already have a stdout callback.
Skipping callback 'json', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'null', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
Skipping callback 'selective', as we already have a stdout callback.
Skipping callback 'skippy', as we already have a stdout callback.
Skipping callback 'stderr', as we already have a stdout callback.
Skipping callback 'unixy', as we already have a stdout callback.
Skipping callback 'yaml', as we already have a stdout callback.

PLAYBOOK: tests_luks.yml ******************************************************* 1 plays in /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml PLAY [Test LUKS] *************************************************************** TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:2 Saturday 12 October 2024 20:50:05 -0400 (0:00:00.030) 0:00:00.030 ****** ok: [managed-node2] META: ran handlers TASK [Enable FIPS mode] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:20 Saturday 12 October 2024 20:50:06 -0400 (0:00:01.048) 0:00:01.079 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reboot] ****************************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:24 Saturday 12 October 2024 20:50:06 -0400 (0:00:00.069) 0:00:01.149 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure dracut-fips] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:34 Saturday 12 October 2024 20:50:06 -0400 (0:00:00.069) 0:00:01.219 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Configure boot for FIPS] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:40 Saturday 12 October 2024 20:50:07 -0400 (0:00:00.101) 0:00:01.321 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reboot] ****************************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:49 Saturday 12 October 2024 20:50:07 -0400 (0:00:00.067) 0:00:01.388 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Run the role] ************************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:53 Saturday 12 October 2024 20:50:07 -0400 (0:00:00.070) 0:00:01.458 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:50:07 -0400 (0:00:00.120) 0:00:01.579 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:50:07 -0400 (0:00:00.054) 0:00:01.634 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:50:07 -0400 (0:00:00.057) 0:00:01.691 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:50:07 -0400 (0:00:00.095) 0:00:01.786 ****** ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:50:07 -0400 (0:00:00.437) 0:00:02.224 ****** ok: [managed-node2] => { "ansible_facts": { "__storage_is_ostree": false }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:50:07 -0400 (0:00:00.072) 0:00:02.297 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:50:08 -0400 (0:00:00.043) 0:00:02.340 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:50:08 -0400 (0:00:00.039) 0:00:02.379 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:50:08 -0400 (0:00:00.127) 0:00:02.507 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is 
already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:50:09 -0400 (0:00:01.534) 0:00:04.041 ****** ok: [managed-node2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:50:09 -0400 (0:00:00.066) 0:00:04.108 ****** ok: [managed-node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:50:09 -0400 (0:00:00.066) 0:00:04.174 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:50:10 -0400 (0:00:00.788) 0:00:04.963 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:50:10 -0400 (0:00:00.115) 0:00:05.078 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:50:10 -0400 (0:00:00.034) 0:00:05.113 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:50:10 -0400 (0:00:00.034) 0:00:05.148 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:50:10 -0400 (0:00:00.036) 0:00:05.185 ****** ok: 
[managed-node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:50:11 -0400 (0:00:00.672) 0:00:05.857 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": 
"dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:50:12 -0400 (0:00:01.178) 0:00:07.035 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:50:12 -0400 (0:00:00.071) 
0:00:07.106 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:50:12 -0400 (0:00:00.045) 0:00:07.152 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 12 October 2024 20:50:13 -0400 (0:00:00.575) 0:00:07.728 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:50:13 -0400 (0:00:00.068) 0:00:07.797 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780590.1113544, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1728780589.7193525, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1728780589.7193525, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:50:13 -0400 (0:00:00.330) 0:00:08.127 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:50:13 -0400 (0:00:00.040) 0:00:08.168 ****** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:50:13 -0400 (0:00:00.040) 0:00:08.209 ****** ok: [managed-node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:50:13 -0400 (0:00:00.048) 0:00:08.257 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] 
*** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:50:14 -0400 (0:00:00.044) 0:00:08.301 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:50:14 -0400 (0:00:00.077) 0:00:08.378 ****** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 October 2024 20:50:14 -0400 (0:00:00.038) 0:00:08.417 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 20:50:14 -0400 (0:00:00.031) 0:00:08.448 ****** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 20:50:14 -0400 (0:00:00.029) 0:00:08.477 ****** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 20:50:14 -0400 (0:00:00.030) 0:00:08.507 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 20:50:14 -0400 (0:00:00.031) 0:00:08.539 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728779093.0713658, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1718879272.062, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131079, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1718879026.308, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072852913879", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 20:50:14 -0400 (0:00:00.310) 0:00:08.849 ****** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 
20:50:14 -0400 (0:00:00.037) 0:00:08.887 ****** ok: [managed-node2] TASK [Get unused disks] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:57 Saturday 12 October 2024 20:50:15 -0400 (0:00:00.647) 0:00:09.535 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml for managed-node2 TASK [Ensure test packages] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:2 Saturday 12 October 2024 20:50:15 -0400 (0:00:00.090) 0:00:09.626 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "util-linux-2.23.2-65.el7_9.1.x86_64 providing util-linux is already installed" ] } lsrpackages: util-linux TASK [Find unused disks in the system] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:11 Saturday 12 October 2024 20:50:15 -0400 (0:00:00.597) 0:00:10.223 ****** ok: [managed-node2] => { "changed": false, "disks": [ "sda" ], "info": [ "Line: NAME=\"/dev/sda\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdb\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdc\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdd\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sde\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdf\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdg\" TYPE=\"disk\" SIZE=\"1099511627776\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdh\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/sdi\" TYPE=\"disk\" SIZE=\"10737418240\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda\" TYPE=\"disk\" SIZE=\"268435456000\" FSTYPE=\"\" LOG-SEC=\"512\"", "Line: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "Line type [part] is not disk: NAME=\"/dev/xvda1\" TYPE=\"part\" SIZE=\"268434390528\" FSTYPE=\"ext4\" LOG-SEC=\"512\"", "filename [xvda1] is a partition", "Disk [/dev/xvda] attrs [{'fstype': '', 'type': 'disk', 'ssize': '512', 'size': '268435456000'}] has partitions" ] } TASK [Debug why there are no unused disks] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:20 Saturday 12 October 2024 20:50:16 -0400 (0:00:00.501) 0:00:10.725 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set unused_disks if necessary] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:29 Saturday 12 October 2024 20:50:16 -0400 (0:00:00.048) 0:00:10.773 ****** ok: [managed-node2] => { "ansible_facts": { "unused_disks": [ "sda" ] }, "changed": false } TASK [Exit playbook when there's not enough unused disks in the system] ******** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:34 Saturday 12 October 2024 20:50:16 -0400 (0:00:00.055) 0:00:10.829 ****** skipping: [managed-node2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Print unused disks] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/get_unused_disk.yml:39 Saturday 12 October 2024 20:50:16 -0400 (0:00:00.036) 0:00:10.865 ****** ok: [managed-node2] => { "unused_disks": [ "sda" ] } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:66 Saturday 12 October 2024 20:50:16 -0400 (0:00:00.042) 0:00:10.908 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 12 October 2024 20:50:16 -0400 (0:00:00.072) 0:00:10.981 ****** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 12 October 2024 20:50:16 -0400 (0:00:00.037) 0:00:11.019 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:50:16 -0400 (0:00:00.056) 0:00:11.075 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:50:16 -0400 (0:00:00.070) 0:00:11.146 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:50:16 -0400 (0:00:00.057) 0:00:11.203 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", 
"changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:50:16 -0400 (0:00:00.092) 0:00:11.295 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:50:17 -0400 (0:00:00.040) 0:00:11.336 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:50:17 -0400 (0:00:00.037) 0:00:11.373 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:50:17 -0400 (0:00:00.033) 0:00:11.407 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:50:17 -0400 (0:00:00.032) 0:00:11.440 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:50:17 -0400 (0:00:00.087) 0:00:11.528 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:50:18 -0400 (0:00:01.484) 0:00:13.013 ****** ok: [managed-node2] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:50:18 -0400 (0:00:00.059) 0:00:13.072 ****** ok: [managed-node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:50:18 -0400 (0:00:00.062) 0:00:13.135 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:50:22 -0400 (0:00:03.980) 0:00:17.115 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:50:22 -0400 (0:00:00.161) 0:00:17.276 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:50:23 -0400 (0:00:00.106) 0:00:17.383 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:50:23 -0400 (0:00:00.105) 0:00:17.488 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:50:23 -0400 (0:00:00.083) 0:00:17.572 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:50:24 -0400 (0:00:00.782) 0:00:18.354 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", 
"status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:50:25 -0400 (0:00:01.097) 0:00:19.451 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:50:25 -0400 (0:00:00.071) 0:00:19.523 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:50:25 -0400 (0:00:00.050) 0:00:19.573 ****** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'foo' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 12 October 2024 20:50:29 -0400 (0:00:03.885) 0:00:23.458 ****** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'foo' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.091) 0:00:23.550 ****** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.040) 0:00:23.590 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 
Saturday 12 October 2024 20:50:29 -0400 (0:00:00.050) 0:00:23.641 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.060) 0:00:23.701 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted disk volume w/ default fs] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:81 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.046) 0:00:23.748 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.089) 0:00:23.837 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.050) 0:00:23.888 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.039) 0:00:23.927 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.087) 0:00:24.015 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.042) 
0:00:24.057 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.049) 0:00:24.107 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.052) 0:00:24.160 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.040) 0:00:24.200 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:50:29 -0400 (0:00:00.090) 0:00:24.291 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:50:31 -0400 (0:00:01.244) 0:00:25.536 ****** ok: [managed-node2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:50:31 -0400 (0:00:00.035) 0:00:25.571 ****** ok: [managed-node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:50:31 -0400 (0:00:00.037) 0:00:25.609 ****** ok: [managed-node2] => { 
"actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:50:35 -0400 (0:00:03.895) 0:00:29.504 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:50:35 -0400 (0:00:00.096) 0:00:29.601 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:50:35 -0400 (0:00:00.050) 0:00:29.651 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:50:35 -0400 (0:00:00.049) 0:00:29.701 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:50:35 -0400 (0:00:00.061) 0:00:29.762 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:50:36 -0400 (0:00:00.701) 0:00:30.463 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", 
"state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": 
"inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": 
"plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": 
"selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:50:37 -0400 (0:00:01.120) 0:00:31.584 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:50:37 -0400 (0:00:00.092) 0:00:31.676 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:50:37 -0400 (0:00:00.052) 0:00:31.729 ****** changed: [managed-node2] => { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, 
"disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 12 October 2024 20:50:47 -0400 (0:00:09.964) 0:00:41.693 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:50:47 -0400 (0:00:00.059) 0:00:41.753 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780590.1113544, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "72884e3f126482c2d28276ff7c57744fa95eff91", "ctime": 1728780589.7193525, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1728780589.7193525, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1229, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:50:47 -0400 (0:00:00.390) 0:00:42.143 ****** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:50:48 -0400 (0:00:00.652) 0:00:42.796 ****** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:50:48 -0400 (0:00:00.072) 0:00:42.869 ****** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "fs_type": null }, { "action": "create format", "device": 
"/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:50:48 -0400 (0:00:00.068) 0:00:42.938 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:50:48 -0400 (0:00:00.057) 0:00:42.995 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, 
"raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:50:48 -0400 (0:00:00.059) 0:00:43.055 ****** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 October 2024 20:50:48 -0400 (0:00:00.036) 0:00:43.091 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 20:50:49 -0400 (0:00:00.848) 0:00:43.940 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 20:50:50 -0400 (0:00:00.620) 0:00:44.560 ****** skipping: [managed-node2] => (item={u'src': u'/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 20:50:50 -0400 (0:00:00.063) 0:00:44.624 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 20:50:50 -0400 (0:00:00.492) 0:00:45.117 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728779093.0713658, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, 
"blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1718879272.062, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 131079, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1718879026.308, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "18446744072852913879", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 20:50:51 -0400 (0:00:00.541) 0:00:45.659 ****** changed: [managed-node2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-f18dd295-4276-4ac6-bb7e-4f20358298fe', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 20:50:51 -0400 (0:00:00.352) 0:00:46.011 ****** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:93 Saturday 12 October 2024 20:50:52 -0400 (0:00:00.784) 0:00:46.795 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 12 October 2024 20:50:52 -0400 (0:00:00.072) 0:00:46.868 ****** skipping: [managed-node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 12 October 2024 20:50:52 -0400 (0:00:00.046) 0:00:46.915 ****** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", 
"mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 12 October 2024 20:50:52 -0400 (0:00:00.084) 0:00:46.999 ****** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "size": "10G", "type": "crypt", "uuid": "b91752e1-178c-45c4-a471-39e998d8ac8b" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "f18dd295-4276-4ac6-bb7e-4f20358298fe" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 12 October 2024 20:50:54 -0400 (0:00:01.714) 0:00:48.714 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002517", "end": "2024-10-12 20:50:54.928181", "rc": 0, "start": "2024-10-12 20:50:54.925664" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs 
ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 12 October 2024 20:50:55 -0400 (0:00:00.623) 0:00:49.338 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002509", "end": "2024-10-12 20:50:55.368609", "failed_when_result": false, "rc": 0, "start": "2024-10-12 20:50:55.366100" } STDOUT: luks-f18dd295-4276-4ac6-bb7e-4f20358298fe /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 12 October 2024 20:50:55 -0400 (0:00:00.405) 0:00:49.743 ****** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 12 October 2024 20:50:55 -0400 (0:00:00.046) 0:00:49.789 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 12 October 2024 20:50:55 -0400 (0:00:00.104) 0:00:49.894 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 12 October 2024 20:50:55 -0400 (0:00:00.065) 0:00:49.960 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for 
TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 12 October 2024 20:50:55 -0400 (0:00:00.405) 0:00:49.743 ******
TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 12 October 2024 20:50:55 -0400 (0:00:00.046) 0:00:49.789 ******
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2
TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 12 October 2024 20:50:55 -0400 (0:00:00.104) 0:00:49.894 ******
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }
TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 12 October 2024 20:50:55 -0400 (0:00:00.065) 0:00:49.960 ******
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2
TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 12 October 2024 20:50:55 -0400 (0:00:00.274) 0:00:50.235 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe" }, "changed": false }
TASK [Set some facts] **********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.101) 0:00:50.337 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }
TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.100) 0:00:50.437 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.052) 0:00:50.490 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.059) 0:00:50.549 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.096) 0:00:50.646 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.091) 0:00:50.737 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.062) 0:00:50.799 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Gather swap info] ********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.075) 0:00:50.875 ******
skipping: [managed-node2] => {
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.048) 0:00:50.924 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.047) 0:00:50.972 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.064) 0:00:51.037 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.095) 0:00:51.132 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.096) 0:00:51.229 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 12 October 2024 20:50:56 -0400 (0:00:00.059) 0:00:51.288 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 12 October 2024 20:50:57 -0400 (0:00:00.051) 0:00:51.340 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 12 October 2024 20:50:57 -0400 (0:00:00.070) 0:00:51.411 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, 
"storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 12 October 2024 20:50:57 -0400 (0:00:00.097) 0:00:51.509 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 12 October 2024 20:50:57 -0400 (0:00:00.066) 0:00:51.575 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 12 October 2024 20:50:57 -0400 (0:00:00.071) 0:00:51.646 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780647.1476333, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780647.1476333, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 27625, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1728780647.1476333, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 12 October 2024 20:50:57 -0400 (0:00:00.363) 0:00:52.009 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 12 October 2024 20:50:57 -0400 (0:00:00.054) 0:00:52.063 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 12 October 2024 20:50:57 -0400 (0:00:00.050) 0:00:52.114 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 12 October 2024 20:50:57 -0400 (0:00:00.059) 0:00:52.173 ****** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] 
****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 12 October 2024 20:50:57 -0400 (0:00:00.105) 0:00:52.279 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 12 October 2024 20:50:58 -0400 (0:00:00.049) 0:00:52.328 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 12 October 2024 20:50:58 -0400 (0:00:00.067) 0:00:52.396 ******
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780647.2546337, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780647.2546337, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 306448, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1728780647.2546337, "nlink": 1, "path": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }
TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 12 October 2024 20:50:58 -0400 (0:00:00.361) 0:00:52.758 ******
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }
lsrpackages: cryptsetup
TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 12 October 2024 20:50:59 -0400 (0:00:00.641) 0:00:53.400 ******
ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.722028", "end": "2024-10-12 20:51:00.068246", "rc": 0, "start": "2024-10-12 20:50:59.346218" }
STDOUT:
LUKS header information for /dev/sda
Version: 1
Cipher name: aes
Cipher mode: xts-plain64
Hash spec: sha256
Payload offset: 8192
MK bits: 512
MK digest: 70 4d 19 6b b2 59 8e ff 50 7b e9 19 d8 64 5b 2e 67 3c 7c b3
MK salt: b5 0e a1 e7 2f 2e 77 38 20 a3 78 2f 09 1d 07 d0 fc 75 8c 47 b3 76 93 e8 a6 e3 c5 74 9b 5a 69 40
MK iterations: 24023
UUID: f18dd295-4276-4ac6-bb7e-4f20358298fe
Key Slot 0: ENABLED
    Iterations: 384374
    Salt: 0a 89 2b 43 6a 3d d6 26 4d 62 28 b9 db b0 e9 41 81 41 9d c2 50 2e 4c 68 27 82 6b ac 0d 1f 07 04
    Key material offset: 8
    AF stripes: 4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
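
The header dump confirms the defaults the role applied on this EL7 host: a LUKS1 container (Version: 1), aes in xts-plain64 mode with a 512-bit master key, and only key slot 0 in use. The "Check LUKS version", "Check LUKS key size" and "Check LUKS cipher" tasks further down are skipped because the test left encryption_luks_version, encryption_key_size and encryption_cipher at null. Where such checks are wanted, collecting and asserting on the header is a plain command task, roughly like this sketch (the task names and register variable are illustrative, not taken from this log):

- name: Dump the LUKS header of the backing device
  command: cryptsetup luksDump /dev/sda
  register: luks_dump
  changed_when: false

- name: Assert the expected LUKS version
  assert:
    that:
      - luks_dump.stdout is search('Version:\s+1')
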
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 12 October 2024 20:51:00 -0400 (0:00:01.027) 0:00:54.427 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.043) 0:00:54.470 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.052) 0:00:54.523 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.041) 0:00:54.564 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.039) 0:00:54.603 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.033) 0:00:54.637 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.035) 0:00:54.672 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Set test variables] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.033) 0:00:54.705 ******
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-f18dd295-4276-4ac6-bb7e-4f20358298fe /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }
TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.039) 0:00:54.745 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
TASK [Validate the format of the crypttab entry] *******************************
task path:
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.041) 0:00:54.787 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.044) 0:00:54.831 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.047) 0:00:54.878 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.047) 0:00:54.926 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.036) 0:00:54.962 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.035) 0:00:54.998 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.035) 0:00:55.034 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.035) 0:00:55.069 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.035) 0:00:55.104 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 12 October 2024 
20:51:00 -0400 (0:00:00.036) 0:00:55.141 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.034) 0:00:55.175 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.033) 0:00:55.209 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.033) 0:00:55.242 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 12 October 2024 20:51:00 -0400 (0:00:00.032) 0:00:55.275 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.032) 0:00:55.307 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.036) 0:00:55.344 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.034) 0:00:55.379 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.033) 0:00:55.413 ****** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.034) 0:00:55.447 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] 
********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.035) 0:00:55.482 ****** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.046) 0:00:55.529 ****** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.063) 0:00:55.592 ****** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.054) 0:00:55.647 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.046) 0:00:55.693 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.041) 0:00:55.735 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.040) 0:00:55.775 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.039) 0:00:55.815 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.041) 0:00:55.856 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.033) 0:00:55.889 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable 
thin pool space from int to Size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.032) 0:00:55.922 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.038) 0:00:55.961 ****** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.045) 0:00:56.006 ****** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.051) 0:00:56.058 ****** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.054) 0:00:56.112 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.052) 0:00:56.165 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.053) 0:00:56.218 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 12 October 2024 20:51:01 -0400 (0:00:00.052) 0:00:56.270 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.051) 0:00:56.322 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.051) 0:00:56.374 ****** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", 
"skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.046) 0:00:56.421 ****** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.090) 0:00:56.511 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.047) 0:00:56.558 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.045) 0:00:56.604 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.033) 0:00:56.638 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.032) 0:00:56.671 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.033) 0:00:56.704 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.034) 0:00:56.739 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.037) 0:00:56.776 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.044) 0:00:56.821 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.054) 0:00:56.875 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 12 October 2024 20:51:02 -0400 (0:00:00.041) 0:00:56.916 ****** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:99 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.434) 0:00:57.351 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.090) 0:00:57.441 ****** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.041) 0:00:57.482 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.050) 0:00:57.532 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.053) 0:00:57.586 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.052) 0:00:57.638 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => 
{ "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.107) 0:00:57.745 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.042) 0:00:57.787 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.039) 0:00:57.827 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.041) 0:00:57.869 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.040) 0:00:57.910 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:51:03 -0400 (0:00:00.081) 0:00:57.991 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", 
"libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:51:04 -0400 (0:00:01.298) 0:00:59.290 ****** ok: [managed-node2] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:51:05 -0400 (0:00:00.086) 0:00:59.376 ****** ok: [managed-node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:51:05 -0400 (0:00:00.069) 0:00:59.446 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:51:09 -0400 (0:00:03.965) 0:01:03.411 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:51:09 -0400 (0:00:00.104) 0:01:03.516 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:51:09 -0400 (0:00:00.049) 0:01:03.566 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:51:09 -0400 (0:00:00.049) 0:01:03.615 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:51:09 -0400 (0:00:00.053) 0:01:03.669 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:51:10 -0400 (0:00:00.879) 0:01:04.548 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": 
"static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", 
"source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:51:11 -0400 (0:00:01.165) 0:01:05.714 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:51:11 -0400 (0:00:00.092) 0:01:05.807 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 
20:51:11 -0400 (0:00:00.056) 0:01:05.863 ****** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-f18dd295-4276-4ac6-bb7e-4f20358298fe' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 12 October 2024 20:51:15 -0400 (0:00:04.191) 0:01:10.055 ****** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10733223936, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-f18dd295-4276-4ac6-bb7e-4f20358298fe' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd 
cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:51:15 -0400 (0:00:00.047) 0:01:10.102 ****** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 12 October 2024 20:51:15 -0400 (0:00:00.035) 0:01:10.138 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 12 October 2024 20:51:15 -0400 (0:00:00.042) 0:01:10.181 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 12 October 2024 20:51:15 -0400 (0:00:00.048) 0:01:10.230 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 12 October 2024 20:51:15 -0400 (0:00:00.032) 0:01:10.262 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780662.9897106, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1728780662.9897106, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1728780662.9897106, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744071681991108", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 12 October 2024 20:51:16 -0400 (0:00:00.336) 0:01:10.599 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:119 Saturday 12 October 2024 20:51:16 -0400 (0:00:00.059) 0:01:10.659 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:51:16 -0400 (0:00:00.159) 0:01:10.818 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:51:16 -0400 (0:00:00.083) 0:01:10.902 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:51:16 -0400 (0:00:00.061) 0:01:10.963 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:51:16 -0400 (0:00:00.124) 0:01:11.088 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:51:16 -0400 (0:00:00.054) 0:01:11.142 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:51:16 -0400 (0:00:00.067) 0:01:11.210 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:51:16 -0400 (0:00:00.050) 0:01:11.260 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:51:17 -0400 (0:00:00.053) 0:01:11.314 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 
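NOTE: The "Remove the encryption layer" step above re-invokes the storage role with the volume spec that the role echoes in the "Show storage_volumes" task below. The test task file itself is not reproduced in this log; a minimal sketch of what that invocation plausibly looks like, assuming the role is included with include_role and that safe mode is disabled for this step (storage_safe_mode: false is an assumption inferred from the earlier safe-mode failure, not shown in this log), would be:

    - name: Remove the encryption layer
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        # Assumption: safe mode must be off so the role may remove the
        # existing LUKS formatting (the earlier run failed with
        # "cannot remove existing formatting ... in safe mode").
        storage_safe_mode: false
        storage_volumes:
          # Values below match the storage_volumes printed by the role
          # in the following "Show storage_volumes" task.
          - name: foo
            type: disk
            disks:
              - sda
            mount_point: /opt/test1
            encryption: false
            encryption_password: yabbadabbadoo

The disk, mount point, volume name, and password are taken from the role's own debug output in this log; the surrounding task structure is illustrative only.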
TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:51:17 -0400 (0:00:00.118) 0:01:11.433 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:51:18 -0400 (0:00:01.457) 0:01:12.891 ****** ok: [managed-node2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:51:18 -0400 (0:00:00.066) 0:01:12.958 ****** ok: [managed-node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:51:18 -0400 (0:00:00.082) 0:01:13.040 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:51:22 -0400 (0:00:04.006) 0:01:17.047 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:51:22 -0400 (0:00:00.078) 0:01:17.125 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:51:22 -0400 (0:00:00.038) 0:01:17.164 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] 
************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:51:22 -0400 (0:00:00.040) 0:01:17.205 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:51:22 -0400 (0:00:00.075) 0:01:17.281 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:51:23 -0400 (0:00:00.654) 0:01:17.935 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": 
"stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, 
"halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", 
"state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, 
"systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57
Saturday 12 October 2024 20:51:24 -0400 (0:00:01.019) 0:01:18.955 ******
ok: [managed-node2] => {
    "ansible_facts": {
        "storage_cryptsetup_services": []
    },
    "changed": false
}

TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63
Saturday 12 October 2024 20:51:24 -0400 (0:00:00.062) 0:01:19.018 ******

TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
Saturday 12 October 2024 20:51:24 -0400 (0:00:00.041) 0:01:19.059 ******
changed: [managed-node2] => {
    "actions": [
        { "action": "destroy format", "device": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "fs_type": "xfs" },
        { "action": "destroy device", "device": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "fs_type": null },
        { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" },
        { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" }
    ],
    "changed": true,
    "crypts": [
        { "backing_device": "/dev/sda", "name": "luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "password": "-", "state": "absent" }
    ],
    "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ],
    "mounts": [
        { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "state": "absent" },
        { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e", "state": "mounted" }
    ],
    "packages": [ "xfsprogs", "e2fsprogs" ],
    "pools": [],
    "volumes": [
        {
            "_device": "/dev/sda",
            "_kernel_device": "/dev/sda",
            "_mount_id": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e",
            "_raw_device": "/dev/sda",
            "_raw_kernel_device": "/dev/sda",
            "cache_devices": [],
            "cache_mode": null,
            "cache_size": 0,
            "cached": false,
            "compression": null,
            "deduplication": null,
            "disks": [ "sda" ],
            "encryption": false,
            "encryption_cipher": null,
            "encryption_key": null,
            "encryption_key_size": 0,
            "encryption_luks_version": "luks1",
            "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER",
            "fs_create_options": "",
            "fs_label": "",
            "fs_overwrite_existing": true,
            "fs_type": "xfs",
            "mount_check": 0,
            "mount_device_identifier": "uuid",
            "mount_group": null,
            "mount_mode": null,
            "mount_options": "defaults",
            "mount_passno": 0,
            "mount_point": "/opt/test1",
            "mount_user": null,
            "name": "foo",
            "raid_chunk_size": null,
            "raid_device_count": null,
            "raid_level": null,
            "raid_metadata_version": null,
            "raid_spare_count": null,
            "raid_stripe_size": null,
            "size": 10733223936,
            "state": "present",
            "thin": null,
            "thin_pool_name": null,
            "thin_pool_size": null,
            "type": "disk",
            "vdo_pool_size": null
        }
    ]
}

TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83
Saturday 12 October 2024 20:51:29 -0400 (0:00:04.459) 0:01:23.518 ******
skipping: [managed-node2] => {
    "changed": false,
    "skip_reason": "Conditional result was False"
}

TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ******
task path:
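The action sequence above is the point of this phase of the test: blivet tears the stack down from the top (destroy the xfs format on the LUKS mapping, remove the mapping itself, destroy the LUKS format on /dev/sda) and then creates xfs directly on the raw disk, removing encryption from the volume, while the mounts list swaps the /etc/fstab source from the /dev/mapper path to the new filesystem UUID. A playbook sketch of the kind of invocation that requests this end state; the values mirror the volumes result above, but this is not the verbatim test file:

    - hosts: managed-node2
      roles:
        - fedora.linux_system_roles.storage
      vars:
        storage_volumes:
          - name: foo
            type: disk
            disks:
              - sda
            fs_type: xfs
            mount_point: /opt/test1
            encryption: false
            # the old passphrase is still supplied so blivet can open and
            # dismantle the existing LUKS layer; luks_password is a placeholder
            encryption_password: "{{ luks_password }}"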
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:51:29 -0400 (0:00:00.054) 0:01:23.573 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780650.1416478, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "03c723dc747c6ee8299a1e83c2985460cc66f9fa", "ctime": 1728780650.1386478, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1728780650.1386478, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:51:29 -0400 (0:00:00.417) 0:01:23.990 ****** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:51:30 -0400 (0:00:00.383) 0:01:24.373 ****** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:51:30 -0400 (0:00:00.049) 0:01:24.423 ****** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", 
"encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:51:30 -0400 (0:00:00.103) 0:01:24.526 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:51:30 -0400 (0:00:00.108) 0:01:24.635 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:51:30 -0400 (0:00:00.073) 0:01:24.709 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-f18dd295-4276-4ac6-bb7e-4f20358298fe" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 
October 2024 20:51:30 -0400 (0:00:00.450) 0:01:25.160 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 20:51:31 -0400 (0:00:00.658) 0:01:25.818 ****** changed: [managed-node2] => (item={u'src': u'UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 20:51:32 -0400 (0:00:00.548) 0:01:26.367 ****** skipping: [managed-node2] => (item={u'src': u'UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 20:51:32 -0400 (0:00:00.069) 0:01:26.437 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 20:51:32 -0400 (0:00:00.776) 0:01:27.213 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780655.3666735, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "03ebbaa6a645fee7cacced9af3a072eade15cbb8", "ctime": 1728780651.614655, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917510, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1728780651.613655, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1741553868", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** 
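Per the crypts entry blivet returned, the role now has to retire the crypttab record for the removed LUKS device; the 0600 mode and 53-byte size in the stat above are consistent with a one-line /etc/crypttab, and the "1 line(s) removed" message below is the result. As a standalone task this boils down to something like the following sketch using the stock crypttab module (the role iterates over every returned entry):

    - name: Drop the retired LUKS mapping from /etc/crypttab (illustrative)
      crypttab:
        name: luks-f18dd295-4276-4ac6-bb7e-4f20358298fe
        state: absent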
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 20:51:33 -0400 (0:00:00.539) 0:01:27.753 ****** changed: [managed-node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-f18dd295-4276-4ac6-bb7e-4f20358298fe', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 20:51:34 -0400 (0:00:00.626) 0:01:28.380 ****** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:132 Saturday 12 October 2024 20:51:34 -0400 (0:00:00.723) 0:01:29.103 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 12 October 2024 20:51:34 -0400 (0:00:00.133) 0:01:29.236 ****** skipping: [managed-node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 12 October 2024 20:51:35 -0400 (0:00:00.086) 0:01:29.323 ****** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_kernel_device": "/dev/sda", "_mount_id": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10733223936, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
*****************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Saturday 12 October 2024 20:51:35 -0400 (0:00:00.071) 0:01:29.394 ******
ok: [managed-node2] => {
    "changed": false,
    "info": {
        "/dev/sda": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "0425a579-170a-4f56-aa40-0ae27b6ced9e" },
        "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" },
        "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" },
        "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" },
        "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" },
        "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" },
        "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" }
    }
}

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Saturday 12 October 2024 20:51:35 -0400 (0:00:00.617) 0:01:30.012 ******
ok: [managed-node2] => {
    "changed": false,
    "cmd": [ "cat", "/etc/fstab" ],
    "delta": "0:00:00.002729",
    "end": "2024-10-12 20:51:36.039239",
    "rc": 0,
    "start": "2024-10-12 20:51:36.036510"
}

STDOUT:

# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e /opt/test1 xfs defaults 0 0

TASK [Read the /etc/crypttab file]
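The info map above carries the same fields lsblk reports per device, and the fstab listing shows the end state the role left behind: the system_role:storage fingerprint comment at the top and a UUID-based entry for /opt/test1 in place of the old /dev/mapper source. For manual debugging, a rough equivalent of that collection step is the following (the test suite uses its own helper script; the column list merely matches the keys shown above):

    - name: Collect per-device info comparable to the test output (illustrative)
      command: lsblk -p -P -o NAME,FSTYPE,LABEL,MOUNTPOINT,SIZE,TYPE,UUID
      register: storage_test_blkinfo
      changed_when: false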
********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 12 October 2024 20:51:36 -0400 (0:00:00.490) 0:01:30.503 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002846", "end": "2024-10-12 20:51:36.479293", "failed_when_result": false, "rc": 0, "start": "2024-10-12 20:51:36.476447" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 12 October 2024 20:51:36 -0400 (0:00:00.358) 0:01:30.862 ****** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 12 October 2024 20:51:36 -0400 (0:00:00.050) 0:01:30.912 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 12 October 2024 20:51:36 -0400 (0:00:00.112) 0:01:31.025 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 12 October 2024 20:51:36 -0400 (0:00:00.059) 0:01:31.085 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.312) 0:01:31.397 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
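Every volume goes through the same eight-part battery (mount, fstab, fs, device, encryption, md, size, cache); parts that do not apply to this plain disk, such as the RAID and LUKS probes, skip on their conditionals. The mount check that follows reduces to an assertion against the gathered mount table; a minimal sketch, assuming a storage_test_volume dict shaped like the volume entries above:

    - name: Assert the volume is mounted at the requested path (illustrative)
      assert:
        that:
          - storage_test_volume.mount_point in
            ansible_facts.mounts | map(attribute='mount') | list
        msg: "{{ storage_test_volume._device }} is not mounted at {{ storage_test_volume.mount_point }}"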
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.086) 0:01:31.484 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.063) 0:01:31.547 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.052) 0:01:31.600 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.117) 0:01:31.717 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.053) 0:01:31.770 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.050) 0:01:31.821 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.051) 0:01:31.872 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.058) 0:01:31.931 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.055) 0:01:31.986 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 12 October 2024 
20:51:37 -0400 (0:00:00.052) 0:01:32.039 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.053) 0:01:32.092 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.087) 0:01:32.180 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 12 October 2024 20:51:37 -0400 (0:00:00.065) 0:01:32.245 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 12 October 2024 20:51:38 -0400 (0:00:00.071) 0:01:32.316 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 12 October 2024 20:51:38 -0400 (0:00:00.058) 0:01:32.375 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 12 October 2024 20:51:38 -0400 (0:00:00.057) 0:01:32.433 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 12 October 2024 20:51:38 -0400 (0:00:00.052) 0:01:32.485 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
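The fstab verification is match counting: each expected count is "1", and the storage_test_fstab_*_matches lists above come from slicing the fstab content around the device identifier, the mount point, and the mount options. The fingerprint check likewise only asserts that the system_role:storage marker stamped into /etc/fstab earlier is still present. A condensed sketch of the matching logic (the register name is illustrative):

    - name: Count fstab lines that reference the volume (illustrative)
      set_fact:
        storage_test_fstab_id_matches: "{{ storage_test_fstab.stdout_lines
          | select('search', 'UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e') | list }}"

    - name: Exactly one line must match
      assert:
        that:
          - storage_test_fstab_id_matches | length == 1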
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 12 October 2024 20:51:38 -0400 (0:00:00.124) 0:01:32.610 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 12 October 2024 20:51:38 -0400 (0:00:00.178) 0:01:32.789 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780689.063838, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780689.063838, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 27625, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1728780689.063838, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 12 October 2024 20:51:39 -0400 (0:00:00.655) 0:01:33.444 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 12 October 2024 20:51:39 -0400 (0:00:00.062) 0:01:33.507 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 12 October 2024 20:51:39 -0400 (0:00:00.048) 0:01:33.555 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 12 October 2024 20:51:39 -0400 (0:00:00.058) 0:01:33.614 ****** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 12 October 2024 20:51:39 -0400 (0:00:00.066) 0:01:33.680 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 12 October 2024 20:51:39 -0400 (0:00:00.064) 0:01:33.744 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] 
************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 12 October 2024 20:51:39 -0400 (0:00:00.117) 0:01:33.862 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 12 October 2024 20:51:39 -0400 (0:00:00.109) 0:01:33.971 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 12 October 2024 20:51:40 -0400 (0:00:00.937) 0:01:34.909 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 12 October 2024 20:51:40 -0400 (0:00:00.049) 0:01:34.958 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 12 October 2024 20:51:40 -0400 (0:00:00.048) 0:01:35.007 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 12 October 2024 20:51:40 -0400 (0:00:00.064) 0:01:35.072 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 12 October 2024 20:51:40 -0400 (0:00:00.047) 0:01:35.119 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 12 October 2024 20:51:40 -0400 (0:00:00.048) 0:01:35.167 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 12 October 2024 20:51:40 -0400 (0:00:00.050) 0:01:35.217 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
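With encryption now false, every LUKS-specific probe in this file skips, and the only live assertion is that the raw device equals the device, meaning no mapper layer sits in between. The cryptsetup package check still runs unconditionally so the probes can work when encryption is in play; in that case the info collection amounts to something like this sketch (not the test's exact task):

    - name: Dump LUKS metadata for an encrypted volume (illustrative)
      command: cryptsetup luksDump /dev/sda
      register: storage_test_luks_dump
      changed_when: false
      when: storage_test_volume.encryption | bool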
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 12 October 2024 20:51:40 -0400 (0:00:00.051) 0:01:35.269 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.048) 0:01:35.318 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.057) 0:01:35.375 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.066) 0:01:35.441 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.069) 0:01:35.511 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.078) 0:01:35.590 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.053) 0:01:35.643 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.064) 0:01:35.707 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.050) 0:01:35.758 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] 
************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.047) 0:01:35.806 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.065) 0:01:35.872 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.053) 0:01:35.926 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.055) 0:01:35.981 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.106) 0:01:36.088 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.050) 0:01:36.138 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.044) 0:01:36.183 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.037) 0:01:36.221 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 12 October 2024 20:51:41 -0400 (0:00:00.032) 0:01:36.253 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 12 October 2024 
20:51:41 -0400 (0:00:00.034) 0:01:36.287 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.035) 0:01:36.323 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.035) 0:01:36.358 ****** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.034) 0:01:36.392 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.034) 0:01:36.427 ****** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.035) 0:01:36.462 ****** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.041) 0:01:36.504 ****** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.062) 0:01:36.566 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.069) 0:01:36.635 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.047) 0:01:36.683 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 
Saturday 12 October 2024 20:51:42 -0400 (0:00:00.050) 0:01:36.733 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.046) 0:01:36.780 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.049) 0:01:36.829 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.050) 0:01:36.880 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.048) 0:01:36.929 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.040) 0:01:36.969 ****** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.043) 0:01:37.013 ****** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.043) 0:01:37.056 ****** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.041) 0:01:37.097 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.035) 0:01:37.133 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 
Saturday 12 October 2024 20:51:42 -0400 (0:00:00.032) 0:01:37.166 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.032) 0:01:37.198 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.032) 0:01:37.231 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 12 October 2024 20:51:42 -0400 (0:00:00.032) 0:01:37.263 ****** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.037) 0:01:37.301 ****** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.038) 0:01:37.339 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.036) 0:01:37.375 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.041) 0:01:37.417 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.051) 0:01:37.468 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.042) 0:01:37.511 ****** skipping: [managed-node2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.049) 0:01:37.560 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.056) 0:01:37.617 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.070) 0:01:37.688 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.041) 0:01:37.730 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.039) 0:01:37.769 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.033) 0:01:37.802 ****** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:138 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.295) 0:01:38.097 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.077) 0:01:38.174 ****** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 12 October 2024 20:51:43 -0400 
(0:00:00.041) 0:01:38.216 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:51:43 -0400 (0:00:00.051) 0:01:38.267 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:51:44 -0400 (0:00:00.052) 0:01:38.320 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:51:44 -0400 (0:00:00.041) 0:01:38.361 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:51:44 -0400 (0:00:00.078) 0:01:38.440 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:51:44 -0400 (0:00:00.041) 0:01:38.482 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:51:44 -0400 (0:00:00.046) 0:01:38.529 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:51:44 -0400 
(0:00:00.049) 0:01:38.578 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:51:44 -0400 (0:00:00.047) 0:01:38.625 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:51:44 -0400 (0:00:00.127) 0:01:38.753 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:51:45 -0400 (0:00:01.271) 0:01:40.025 ****** ok: [managed-node2] => { "storage_pools": [] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:51:45 -0400 (0:00:00.072) 0:01:40.097 ****** ok: [managed-node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:51:45 -0400 (0:00:00.055) 0:01:40.153 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:51:49 -0400 (0:00:03.817) 0:01:43.970 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:51:49 -0400 
(0:00:00.093) 0:01:44.064 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:51:49 -0400 (0:00:00.046) 0:01:44.111 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:51:49 -0400 (0:00:00.045) 0:01:44.156 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:51:49 -0400 (0:00:00.037) 0:01:44.194 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:51:50 -0400 (0:00:00.650) 0:01:44.844 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { 
"name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": 
"firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", 
"state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", 
"status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service": { "name": "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": 
"systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": 
"systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:51:51 -0400 (0:00:00.993) 0:01:45.838 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:51:51 -0400 (0:00:00.072) 0:01:45.910 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2df18dd295\x2d4276\x2d4ac6\x2dbb7e\x2d4f20358298fe.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service", "name": "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket dev-sda.device cryptsetup-pre.target systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-f18dd295-4276-4ac6-bb7e-4f20358298fe", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-f18dd295-4276-4ac6-bb7e-4f20358298fe /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup 
detach luks-f18dd295-4276-4ac6-bb7e-4f20358298fe ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:51:52 -0400 (0:00:00.535) 0:01:46.446 ****** fatal: [managed-node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 12 October 2024 20:51:56 -0400 (0:00:04.074) 0:01:50.520 ****** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [], u'volumes': [{u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 10737418240, u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'fs_overwrite_existing': True, u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'mount_user': None, u'raid_spare_count': None, u'cache_mode': None, u'name': u'foo', u'mount_group': None, u'type': u'disk', u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 
2024 20:51:56 -0400 (0:00:00.086) 0:01:50.607 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2df18dd295\x2d4276\x2d4ac6\x2dbb7e\x2d4f20358298fe.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service", "name": "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2df18dd295\\x2d4276\\x2d4ac6\\x2dbb7e\\x2d4f20358298fe.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", 
"StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 12 October 2024 20:51:56 -0400 (0:00:00.588) 0:01:51.195 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 12 October 2024 20:51:56 -0400 (0:00:00.055) 0:01:51.250 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 12 October 2024 20:51:57 -0400 (0:00:00.096) 0:01:51.347 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 12 October 2024 20:51:57 -0400 (0:00:00.076) 0:01:51.424 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780703.7419088, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1728780703.7419088, "dev": 2048, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1728780703.7419088, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "134592977", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 12 October 2024 20:51:57 -0400 (0:00:00.487) 0:01:51.911 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:158 Saturday 12 October 2024 20:51:57 -0400 (0:00:00.066) 0:01:51.978 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 
20:51:57 -0400 (0:00:00.231) 0:01:52.210 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:51:57 -0400 (0:00:00.083) 0:01:52.293 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:51:58 -0400 (0:00:00.049) 0:01:52.342 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:51:58 -0400 (0:00:00.175) 0:01:52.517 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:51:58 -0400 (0:00:00.045) 0:01:52.563 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:51:58 -0400 (0:00:00.037) 0:01:52.601 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:51:58 -0400 (0:00:00.036) 0:01:52.638 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
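
Note: in the CentOS_7.yml vars included above, blivet_package_list carries an unrendered Jinja expression; it is evaluated only when the list is consumed by the package-install task, at which point ansible_architecture selects libblockdev-s390 on s390x and plain libblockdev everywhere else. The relevant shape of such a vars file, mirroring exactly the values shown above:

```yaml
# Shape of roles/storage/vars/CentOS_7.yml as reflected in this log
__storage_blivet_diskvolume_mkfs_option_map:
  ext2: -F
  ext3: -F
  ext4: -F
blivet_package_list:
  - python-enum34
  - python-blivet3
  - libblockdev-crypto
  - libblockdev-dm
  - libblockdev-lvm
  - libblockdev-mdraid
  - libblockdev-swap
  - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"
```
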
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:51:58 -0400 (0:00:00.043) 0:01:52.681 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:51:58 -0400 (0:00:00.120) 0:01:52.802 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:51:59 -0400 (0:00:01.368) 0:01:54.171 ****** ok: [managed-node2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:51:59 -0400 (0:00:00.060) 0:01:54.231 ****** ok: [managed-node2] => { "storage_volumes": [ { "disks": [ "sda" ], "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "foo", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:52:00 -0400 (0:00:00.139) 0:01:54.371 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:52:04 -0400 (0:00:03.931) 0:01:58.303 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:52:04 -0400 (0:00:00.175) 0:01:58.479 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
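
Note: the "Get required packages" call above appears to be a dry pass of the same blivet module (its packages_only flag is visible in the module args of the earlier failure), which is why it reports no actions yet returns "cryptsetup" as a needed package; the role then installs that list in the task below. A rough equivalent of the install step on this EL7 node, again assuming the blivet result is registered as blivet_output:

```yaml
- name: Make sure required packages are installed (sketch)
  yum:
    name: "{{ blivet_output.packages }}"   # ["cryptsetup"] in this run
    state: present
```
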
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:52:04 -0400 (0:00:00.072) 0:01:58.551 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:52:04 -0400 (0:00:00.065) 0:01:58.617 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:52:04 -0400 (0:00:00.045) 0:01:58.663 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:52:05 -0400 (0:00:00.758) 0:01:59.422 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": 
"console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", 
"source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, 
"network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": 
"systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:52:06 -0400 (0:00:00.964) 0:02:00.387 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:52:06 -0400 (0:00:00.075) 0:02:00.463 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:52:06 -0400 (0:00:00.049) 0:02:00.512 ****** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-b040dea9-044f-4b42-a658-67b331e77c81", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", 
"vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 12 October 2024 20:52:16 -0400 (0:00:10.044) 0:02:10.557 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:52:16 -0400 (0:00:00.058) 0:02:10.616 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780691.920852, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "527ea34b12de2e4df694e3bee02f680d7a71b268", "ctime": 1728780691.918852, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1728780691.918852, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:52:16 -0400 (0:00:00.375) 0:02:10.991 ****** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:52:17 -0400 (0:00:00.559) 0:02:11.551 ****** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:52:17 -0400 (0:00:00.051) 0:02:11.603 ****** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-b040dea9-044f-4b42-a658-67b331e77c81", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "state": 
"mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:52:17 -0400 (0:00:00.094) 0:02:11.698 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:52:17 -0400 (0:00:00.076) 0:02:11.775 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:52:17 -0400 (0:00:00.072) 0:02:11.847 ****** changed: [managed-node2] => (item={u'src': u'UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e', u'state': 
u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=0425a579-170a-4f56-aa40-0ae27b6ced9e" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 October 2024 20:52:17 -0400 (0:00:00.370) 0:02:12.218 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 20:52:18 -0400 (0:00:00.499) 0:02:12.718 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 20:52:18 -0400 (0:00:00.354) 0:02:13.073 ****** skipping: [managed-node2] => (item={u'src': u'/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 20:52:18 -0400 (0:00:00.045) 0:02:13.118 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 20:52:19 -0400 (0:00:00.439) 0:02:13.558 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780696.477874, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1728780693.9548619, "dev": 51713, 
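
Note: each fstab edit above is followed by a "Tell systemd to refresh its view of /etc/fstab" task, so systemd regenerates its mount units from the updated file. A minimal equivalent using the systemd module's daemon_reload flag:

```yaml
- name: Tell systemd to refresh its view of /etc/fstab (sketch)
  systemd:
    daemon_reload: true
```
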
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917511, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1728780693.953862, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1741554027", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 20:52:19 -0400 (0:00:00.360) 0:02:13.919 ****** changed: [managed-node2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-b040dea9-044f-4b42-a658-67b331e77c81', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-b040dea9-044f-4b42-a658-67b331e77c81", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 20:52:20 -0400 (0:00:00.391) 0:02:14.311 ****** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:171 Saturday 12 October 2024 20:52:20 -0400 (0:00:00.677) 0:02:14.989 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 12 October 2024 20:52:20 -0400 (0:00:00.107) 0:02:15.097 ****** skipping: [managed-node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 12 October 2024 20:52:20 -0400 (0:00:00.042) 0:02:15.139 ****** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "_raw_device": "/dev/sda", "_raw_kernel_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, 
"raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "present", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 12 October 2024 20:52:20 -0400 (0:00:00.079) 0:02:15.220 ****** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "size": "10G", "type": "crypt", "uuid": "2f218006-5547-4149-b0ef-b1da7bbd450b" }, "/dev/sda": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "b040dea9-044f-4b42-a658-67b331e77c81" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 12 October 2024 20:52:21 -0400 (0:00:00.538) 0:02:15.758 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002691", "end": "2024-10-12 20:52:21.715125", "rc": 0, "start": "2024-10-12 20:52:21.712434" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs 
defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 12 October 2024 20:52:21 -0400 (0:00:00.371) 0:02:16.129 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002527", "end": "2024-10-12 20:52:22.168007", "failed_when_result": false, "rc": 0, "start": "2024-10-12 20:52:22.165480" } STDOUT: luks-b040dea9-044f-4b42-a658-67b331e77c81 /dev/sda - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 12 October 2024 20:52:22 -0400 (0:00:00.399) 0:02:16.529 ****** TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 12 October 2024 20:52:22 -0400 (0:00:00.043) 0:02:16.573 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 12 October 2024 20:52:22 -0400 (0:00:00.150) 0:02:16.723 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 12 October 2024 20:52:22 -0400 (0:00:00.090) 0:02:16.814 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: 
TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Saturday 12 October 2024 20:52:22 -0400 (0:00:00.265) 0:02:17.080 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Saturday 12 October 2024 20:52:22 -0400 (0:00:00.059) 0:02:17.139 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Saturday 12 October 2024 20:52:22 -0400 (0:00:00.093) 0:02:17.233 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28
Saturday 12 October 2024 20:52:22 -0400 (0:00:00.050) 0:02:17.284 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36
Saturday 12 October 2024 20:52:23 -0400 (0:00:00.065) 0:02:17.349 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42
Saturday 12 October 2024 20:52:23 -0400 (0:00:00.076) 0:02:17.426 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48
Saturday 12 October 2024 20:52:23 -0400 (0:00:00.043) 0:02:17.470 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57
Saturday 12 October 2024 20:52:23 -0400 (0:00:00.048) 0:02:17.518 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63
Saturday 12 October 2024 20:52:23 -0400 (0:00:00.050) 0:02:17.569 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69
Saturday 12 October 2024 20:52:23 -0400 (0:00:00.050) 0:02:17.620 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79
Saturday 12 October 2024 20:52:23 -0400 (0:00:00.054) 0:02:17.674 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }
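The mount-state assertion above passes because the decrypted mapper device, not /dev/sda itself, is what appears in the mount table. An equivalent check against gathered mount facts might look like this (a sketch only; the suite's own condition is not shown in the log):

    - name: Verify the current mount state by device
      assert:
        that:
          - >-
            ansible_facts.mounts
            | selectattr('device', 'equalto', storage_test_device_path)
            | selectattr('mount', 'equalto', storage_test_mount_expected_mount_point)
            | list | length == 1
        msg: "Expected {{ storage_test_device_path }} mounted at {{ storage_test_mount_expected_mount_point }}"
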
"skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 12 October 2024 20:52:23 -0400 (0:00:00.050) 0:02:17.620 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 12 October 2024 20:52:23 -0400 (0:00:00.054) 0:02:17.674 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 12 October 2024 20:52:23 -0400 (0:00:00.054) 0:02:17.729 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 12 October 2024 20:52:23 -0400 (0:00:00.087) 0:02:17.816 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 12 October 2024 20:52:23 -0400 (0:00:00.064) 0:02:17.881 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 12 October 2024 20:52:23 -0400 (0:00:00.064) 0:02:17.945 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 12 October 2024 20:52:23 -0400 (0:00:00.051) 0:02:17.997 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 12 October 2024 20:52:23 -0400 (0:00:00.083) 0:02:18.081 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, 
"storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 12 October 2024 20:52:23 -0400 (0:00:00.071) 0:02:18.153 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 12 October 2024 20:52:23 -0400 (0:00:00.067) 0:02:18.220 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 12 October 2024 20:52:24 -0400 (0:00:00.130) 0:02:18.351 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780736.0100625, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780736.0100625, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 27625, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1728780736.0100625, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 12 October 2024 20:52:24 -0400 (0:00:00.444) 0:02:18.796 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 12 October 2024 20:52:24 -0400 (0:00:00.067) 0:02:18.863 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 12 October 2024 20:52:24 -0400 (0:00:00.063) 0:02:18.927 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 12 October 2024 20:52:24 -0400 (0:00:00.085) 0:02:19.012 ****** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: 
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Saturday 12 October 2024 20:52:24 -0400 (0:00:00.063) 0:02:19.215 ******
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780736.126063, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780736.126063, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 324699, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1728780736.126063, "nlink": 1, "path": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Saturday 12 October 2024 20:52:25 -0400 (0:00:00.518) 0:02:19.733 ******
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }
lsrpackages: cryptsetup

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Saturday 12 October 2024 20:52:26 -0400 (0:00:00.863) 0:02:20.596 ******
ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda" ], "delta": "0:00:00.026152", "end": "2024-10-12 20:52:26.769618", "rc": 0, "start": "2024-10-12 20:52:26.743466" }

STDOUT:

LUKS header information for /dev/sda

Version:        1
Cipher name:    aes
Cipher mode:    xts-plain64
Hash spec:      sha256
Payload offset: 8192
MK bits:        512
MK digest:      77 da dd f2 3a f4 db 81 7e 69 c3 d8 26 39 31 6c 1e 97 e3 d3
MK salt:        0f 6c cd c6 b8 ec 59 39 a5 98 86 c0 ed da 96 d7 a6 0b d3 7f 5d fa 97 5f f7 bd 09 c7 a6 ea cc 79
MK iterations:  23918
UUID:           b040dea9-044f-4b42-a658-67b331e77c81

Key Slot 0: ENABLED
        Iterations:             382134
        Salt:                   1b 90 e1 f3 ef 86 08 27 0e cb 14 2e 0c 3b 98 90 40 d8 5d 1e bb 42 6e b8 30 f4 b3 03 72 95 65 01
        Key material offset:    8
        AF stripes:             4000
Key Slot 1: DISABLED
Key Slot 2: DISABLED
Key Slot 3: DISABLED
Key Slot 4: DISABLED
Key Slot 5: DISABLED
Key Slot 6: DISABLED
Key Slot 7: DISABLED
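This luksDump output is what the conditional checks further down (LUKS version, key size, cipher) would parse; they are skipped in this run because the test did not pin those values. A sketch of such a check, assuming the LUKS1 header format shown above:

    - name: Collect LUKS info for this volume
      command: cryptsetup luksDump /dev/sda
      register: luks_dump
      changed_when: false

    - name: Check LUKS version
      assert:
        that:
          - luks_dump.stdout is search('Version:\s+1')
          - luks_dump.stdout is search('Cipher name:\s+aes')
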
TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Saturday 12 October 2024 20:52:26 -0400 (0:00:00.615) 0:02:21.212 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Saturday 12 October 2024 20:52:26 -0400 (0:00:00.073) 0:02:21.286 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Saturday 12 October 2024 20:52:27 -0400 (0:00:00.084) 0:02:21.370 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Saturday 12 October 2024 20:52:27 -0400 (0:00:00.139) 0:02:21.510 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Saturday 12 October 2024 20:52:27 -0400 (0:00:00.090) 0:02:21.600 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Saturday 12 October 2024 20:52:27 -0400 (0:00:00.064) 0:02:21.665 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Saturday 12 October 2024 20:52:27 -0400 (0:00:00.110) 0:02:21.776 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Saturday 12 October 2024 20:52:27 -0400 (0:00:00.082) 0:02:21.858 ******
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-b040dea9-044f-4b42-a658-67b331e77c81 /dev/sda -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Saturday 12 October 2024 20:52:27 -0400 (0:00:00.123) 0:02:21.981 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed
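A crypttab line has three fields: mapper name, backing device, and key file ("-" meaning none). The format/backing-device/key-file assertions that follow check exactly those fields against the facts set above. An illustrative sketch of the shape of those checks:

    - name: Validate the format of the crypttab entry
      assert:
        that:
          - _storage_test_crypttab_entries | length == _storage_test_expected_crypttab_entries | int
          - _storage_test_crypttab_entries[0].split() | length == 3

    - name: Check key file of crypttab entry
      assert:
        that:
          - _storage_test_crypttab_entries[0].split()[2] == _storage_test_expected_crypttab_key_file
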
TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Saturday 12 October 2024 20:52:27 -0400 (0:00:00.092) 0:02:22.073 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108
Saturday 12 October 2024 20:52:27 -0400 (0:00:00.095) 0:02:22.168 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116
Saturday 12 October 2024 20:52:27 -0400 (0:00:00.088) 0:02:22.257 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.120) 0:02:22.377 ******
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }

TASK [Get information about RAID] **********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.051) 0:02:22.429 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set active devices regex] ************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.060) 0:02:22.490 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.085) 0:02:22.576 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.068) 0:02:22.644 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.108) 0:02:22.752 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.120) 0:02:22.873 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.067) 0:02:22.941 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.053) 0:02:22.994 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.049) 0:02:23.044 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.075) 0:02:23.120 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.053) 0:02:23.174 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Saturday 12 October 2024 20:52:28 -0400 (0:00:00.102) 0:02:23.277 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Saturday 12 October 2024 20:52:29 -0400 (0:00:00.078) 0:02:23.356 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Saturday 12 October 2024 20:52:29 -0400 (0:00:00.066) 0:02:23.422 ******
ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Saturday 12 October 2024 20:52:29 -0400 (0:00:00.064) 0:02:23.487 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
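All of the size checks in this block are skipped because a raw disk volume has no requested size to compare against (hence storage_test_expected_size is undefined). For a sized volume they would reduce to converting the human-readable size and comparing it with the actual byte count. A sketch under that assumption, using the built-in human_to_bytes filter; the 10 GiB figure matches the volume shown earlier in the log:

    - name: Parse the requested size of the volume
      set_fact:
        storage_test_requested_size: "{{ '10g' | human_to_bytes }}"   # -> 10737418240

    - name: Assert expected size is actual size
      assert:
        that:
          - storage_test_requested_size | int == storage_test_actual_size_bytes | int   # actual-size variable name illustrative
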
TASK [Show test pool] **********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Saturday 12 October 2024 20:52:29 -0400 (0:00:00.124) 0:02:23.611 ******
skipping: [managed-node2] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Saturday 12 October 2024 20:52:29 -0400 (0:00:00.118) 0:02:23.730 ******
skipping: [managed-node2] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Saturday 12 October 2024 20:52:29 -0400 (0:00:00.112) 0:02:23.842 ******
skipping: [managed-node2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Saturday 12 October 2024 20:52:29 -0400 (0:00:00.129) 0:02:23.972 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67
Saturday 12 October 2024 20:52:29 -0400 (0:00:00.098) 0:02:24.071 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71
Saturday 12 October 2024 20:52:29 -0400 (0:00:00.094) 0:02:24.165 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76
Saturday 12 October 2024 20:52:29 -0400 (0:00:00.089) 0:02:24.255 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.070) 0:02:24.326 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.073) 0:02:24.399 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.058) 0:02:24.458 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.052) 0:02:24.511 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.052) 0:02:24.563 ******
skipping: [managed-node2] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.061) 0:02:24.625 ******
skipping: [managed-node2] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.049) 0:02:24.674 ******
skipping: [managed-node2] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.052) 0:02:24.727 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.053) 0:02:24.780 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.095) 0:02:24.876 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.103) 0:02:24.980 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.115) 0:02:25.096 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143
Saturday 12 October 2024 20:52:30 -0400 (0:00:00.109) 0:02:25.205 ******
ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }
"skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 12 October 2024 20:52:30 -0400 (0:00:00.075) 0:02:25.280 ****** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 12 October 2024 20:52:31 -0400 (0:00:00.127) 0:02:25.408 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 12 October 2024 20:52:31 -0400 (0:00:00.067) 0:02:25.475 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 12 October 2024 20:52:31 -0400 (0:00:00.101) 0:02:25.577 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 12 October 2024 20:52:31 -0400 (0:00:00.099) 0:02:25.676 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 12 October 2024 20:52:31 -0400 (0:00:00.081) 0:02:25.758 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 12 October 2024 20:52:31 -0400 (0:00:00.075) 0:02:25.834 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 12 October 2024 20:52:31 -0400 (0:00:00.053) 0:02:25.887 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 12 October 2024 20:52:31 -0400 (0:00:00.049) 0:02:25.937 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: 
TASK [Clean up facts] **********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Saturday 12 October 2024 20:52:31 -0400 (0:00:00.052) 0:02:25.989 ******
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Saturday 12 October 2024 20:52:31 -0400 (0:00:00.062) 0:02:26.052 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

TASK [Test for correct handling of new encrypted volume w/ no key] *************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:178
Saturday 12 October 2024 20:52:31 -0400 (0:00:00.051) 0:02:26.104 ******
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2

TASK [Store global variable value copy] ****************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4
Saturday 12 October 2024 20:52:32 -0400 (0:00:00.204) 0:02:26.308 ******
ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false }

TASK [Verify role raises correct error] ****************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10
Saturday 12 October 2024 20:52:32 -0400 (0:00:00.069) 0:02:26.378 ******
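verify-role-failed.yml implements a negative test: it snapshots the input variables, runs the role with a configuration that must fail (here, a new encrypted volume with no key while safe mode is on), and asserts that the role errors out instead of proceeding. The usual shape of such a check is a block/rescue pair; this is a sketch only, as the file's actual assertion text is not shown in this log:

    - name: Verify role raises correct error
      block:
        - name: Run the role with the failing configuration
          include_role:
            name: fedora.linux_system_roles.storage

        - name: Unreachable on success
          fail:
            msg: "Role was expected to fail but did not"
      rescue:
        - name: Check that the failure is the expected one
          assert:
            that:
              - ansible_failed_result.msg is defined   # real check would match the specific error text
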
TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2
Saturday 12 October 2024 20:52:32 -0400 (0:00:00.143) 0:02:26.521 ******
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2
Saturday 12 October 2024 20:52:32 -0400 (0:00:00.131) 0:02:26.652 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7
Saturday 12 October 2024 20:52:32 -0400 (0:00:00.076) 0:02:26.729 ******
skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" }
skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" }
ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" }
skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Check if system is ostree] ***********
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25
Saturday 12 October 2024 20:52:32 -0400 (0:00:00.125) 0:02:26.854 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30
Saturday 12 October 2024 20:52:32 -0400 (0:00:00.127) 0:02:26.981 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5
Saturday 12 October 2024 20:52:32 -0400 (0:00:00.062) 0:02:27.044 ******
ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9
Saturday 12 October 2024 20:52:32 -0400 (0:00:00.051) 0:02:27.095 ******
ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13
Saturday 12 October 2024 20:52:32 -0400 (0:00:00.093) 0:02:27.189 ******
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Make sure blivet is available] *******
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
Saturday 12 October 2024 20:52:33 -0400 (0:00:00.253) 0:02:27.442 ******
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] }
lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34
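blivet_package_list comes from the platform vars file loaded above (CentOS_7.yml), with the architecture-dependent entry resolved by Jinja2 at install time; on this x86_64 node the conditional "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" picks plain libblockdev. Installing the list is then a single task along these lines (a sketch; the role's actual task may carry extra options):

    - name: Make sure blivet is available
      package:
        name: "{{ blivet_package_list }}"
        state: present
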
TASK [fedora.linux_system_roles.storage : Show storage_pools] ******************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9
Saturday 12 October 2024 20:52:39 -0400 (0:00:06.326) 0:02:33.769 ******
ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] }

TASK [fedora.linux_system_roles.storage : Show storage_volumes] ****************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14
Saturday 12 October 2024 20:52:39 -0400 (0:00:00.060) 0:02:33.829 ******
ok: [managed-node2] => { "storage_volumes": [] }

TASK [fedora.linux_system_roles.storage : Get required packages] ***************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
Saturday 12 October 2024 20:52:39 -0400 (0:00:00.057) 0:02:33.887 ******
ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] }

TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31
Saturday 12 October 2024 20:52:43 -0400 (0:00:03.926) 0:02:37.813 ******
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2

TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2
Saturday 12 October 2024 20:52:43 -0400 (0:00:00.105) 0:02:37.919 ******

TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13
Saturday 12 October 2024 20:52:43 -0400 (0:00:00.102) 0:02:38.022 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19
Saturday 12 October 2024 20:52:43 -0400 (0:00:00.068) 0:02:38.090 ******

TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37
Saturday 12 October 2024 20:52:43 -0400 (0:00:00.050) 0:02:38.141 ******
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] }
lsrpackages: cryptsetup kpartx

TASK [fedora.linux_system_roles.storage : Get service facts] *******************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51
Saturday 12 October 2024 20:52:44 -0400 (0:00:00.713) 0:02:38.854 ******
ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name":
"NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", 
"status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { 
"name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:52:45 -0400 (0:00:01.138) 0:02:39.992 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:52:45 -0400 (0:00:00.082) 0:02:40.075 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:52:45 -0400 (0:00:00.058) 0:02:40.134 ****** fatal: [managed-node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 12 October 2024 20:52:49 -0400 (0:00:04.028) 0:02:44.162 ****** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': False, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 
0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:52:49 -0400 (0:00:00.068) 0:02:44.230 ****** TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 12 October 2024 20:52:49 -0400 (0:00:00.044) 0:02:44.275 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.054) 0:02:44.330 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.067) 0:02:44.397 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted partition volume w/ default fs] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:197 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.043) 0:02:44.441 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.118) 0:02:44.560 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.053) 0:02:44.614 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.040) 0:02:44.655 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 
's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.078) 0:02:44.733 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.032) 0:02:44.765 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.033) 0:02:44.799 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.032) 0:02:44.832 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.034) 0:02:44.866 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:52:50 -0400 (0:00:00.083) 0:02:44.949 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:52:51 -0400 (0:00:01.303) 0:02:46.253 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:52:51 -0400 (0:00:00.046) 0:02:46.299 ****** ok: [managed-node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:52:52 -0400 (0:00:00.052) 0:02:46.352 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:52:55 -0400 (0:00:03.917) 0:02:50.269 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:52:56 -0400 (0:00:00.078) 0:02:50.347 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:52:56 -0400 (0:00:00.037) 0:02:50.385 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:52:56 -0400 (0:00:00.033) 0:02:50.418 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:52:56 -0400 (0:00:00.030) 0:02:50.449 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:52:56 -0400 (0:00:00.814) 0:02:51.263 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": 
"systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, 
"dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": 
"enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": 
"systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:52:57 -0400 (0:00:00.954) 0:02:52.218 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:52:57 -0400 (0:00:00.079) 0:02:52.297 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:52:58 -0400 (0:00:00.048) 0:02:52.345 ****** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", 
"fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-b040dea9-044f-4b42-a658-67b331e77c81", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 12 October 2024 20:53:08 -0400 (0:00:10.625) 0:03:02.970 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:53:08 -0400 (0:00:00.050) 0:03:03.021 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780738.6930754, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "5822710abf9da9c5ef4f4f4dd90f79a266a00df7", "ctime": 1728780738.6910753, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1728780738.6910753, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:53:09 -0400 (0:00:00.367) 0:03:03.389 ****** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:53:09 -0400 (0:00:00.360) 0:03:03.749 ****** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:53:09 -0400 (0:00:00.046) 0:03:03.796 ****** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create device", "device": "/dev/sda1", "fs_type": null }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda", "name": "luks-b040dea9-044f-4b42-a658-67b331e77c81", "password": "-", "state": "absent" }, { "backing_device": "/dev/sda1", "name": "luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, 
"passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:53:09 -0400 (0:00:00.075) 0:03:03.872 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", 
"mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:53:09 -0400 (0:00:00.062) 0:03:03.935 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:53:09 -0400 (0:00:00.058) 0:03:03.993 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-b040dea9-044f-4b42-a658-67b331e77c81" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 October 2024 20:53:10 -0400 (0:00:00.361) 0:03:04.354 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 20:53:10 -0400 (0:00:00.466) 0:03:04.821 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 20:53:10 -0400 (0:00:00.364) 0:03:05.186 ****** skipping: [managed-node2] => (item={u'src': u'/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": 
false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 20:53:10 -0400 (0:00:00.043) 0:03:05.229 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 20:53:11 -0400 (0:00:00.452) 0:03:05.681 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780742.1670918, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "c069ac3fe7f9ed5253d70ba66bfd6b95137d3873", "ctime": 1728780739.9350812, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917510, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1728780739.9350812, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 53, "uid": 0, "version": "1741554187", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 20:53:11 -0400 (0:00:00.416) 0:03:06.098 ****** changed: [managed-node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-b040dea9-044f-4b42-a658-67b331e77c81', u'backing_device': u'/dev/sda'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda", "name": "luks-b040dea9-044f-4b42-a658-67b331e77c81", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-69072daa-a86c-4059-9dc7-9ff32c5d8593', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 20:53:12 -0400 (0:00:00.670) 0:03:06.768 ****** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:214 Saturday 12 October 2024 20:53:13 -0400 (0:00:00.670) 0:03:07.438 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for 
managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 12 October 2024 20:53:13 -0400 (0:00:00.106) 0:03:07.545 ****** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 12 October 2024 20:53:13 -0400 (0:00:00.064) 0:03:07.609 ****** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 12 October 2024 20:53:13 -0400 (0:00:00.050) 0:03:07.659 ****** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "size": "10G", "type": "crypt", "uuid": "7b965fac-1459-4c4a-a7d9-769737bdf1e2" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "69072daa-a86c-4059-9dc7-9ff32c5d8593" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 12 October 2024 20:53:13 -0400 (0:00:00.373) 0:03:08.033 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002764", "end": "2024-10-12 20:53:14.005432", "rc": 0, "start": "2024-10-12 20:53:14.002668" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 12 October 2024 20:53:14 -0400 (0:00:00.353) 0:03:08.386 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002591", "end": "2024-10-12 20:53:14.365782", "failed_when_result": false, "rc": 0, "start": "2024-10-12 20:53:14.363191" } STDOUT: luks-69072daa-a86c-4059-9dc7-9ff32c5d8593 /dev/sda1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 12 October 2024 20:53:14 -0400 (0:00:00.352) 0:03:08.739 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 12 October 2024 20:53:14 -0400 (0:00:00.078) 0:03:08.818 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 12 October 2024 20:53:14 -0400 (0:00:00.089) 0:03:08.907 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 12 October 2024 20:53:14 -0400 (0:00:00.050) 0:03:08.958 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 12 October 2024 20:53:14 -0400 (0:00:00.051) 0:03:09.010 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 12 October 2024 20:53:14 -0400 (0:00:00.130) 0:03:09.140 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 12 October 2024 20:53:14 -0400 (0:00:00.055) 0:03:09.196 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 12 October 2024 20:53:14 -0400 (0:00:00.054) 0:03:09.250 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 12 October 2024 20:53:14 -0400 (0:00:00.049) 0:03:09.300 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.049) 0:03:09.350 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.053) 0:03:09.403 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.054) 0:03:09.457 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.051) 0:03:09.509 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.051) 0:03:09.560 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.052) 0:03:09.613 ****** ok: [managed-node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.45.43 closed. 
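Note: the pool layout being verified in the output above (pool "foo" on disk sda, a single 4g encrypted xfs volume "test1" mounted at /opt/test1) could be requested from the fedora.linux_system_roles.storage role with a variable definition along the following lines. This is a reconstruction from the values printed in the log, not the test's own source; the play header, the use of become, and the luks_password variable are assumptions added for illustration.

- hosts: managed-node2
  become: true
  roles:
    - role: fedora.linux_system_roles.storage
      vars:
        storage_pools:
          - name: foo                 # pool name reported in the log above
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                size: 4g
                fs_type: xfs
                mount_point: /opt/test1
                encryption: true
                # the real run passes a secret shown as VALUE_SPECIFIED_IN_NO_LOG_PARAMETER;
                # luks_password is a hypothetical placeholder variable, not taken from the test
                encryption_password: "{{ luks_password }}"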
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.259) 0:03:09.873 ****** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.058) 0:03:09.931 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.125) 0:03:10.057 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.054) 0:03:10.111 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.057) 0:03:10.169 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.055) 0:03:10.225 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 12 October 2024 20:53:15 -0400 (0:00:00.054) 0:03:10.279 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.052) 0:03:10.332 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.052) 0:03:10.384 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.051) 0:03:10.436 ****** skipping: [managed-node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.054) 0:03:10.491 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.052) 0:03:10.544 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.054) 0:03:10.598 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.052) 0:03:10.651 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.108) 0:03:10.760 ****** skipping: [managed-node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.057) 0:03:10.817 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.087) 0:03:10.904 ****** skipping: [managed-node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": 
true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.048) 0:03:10.952 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.090) 0:03:11.042 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.039) 0:03:11.082 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.034) 0:03:11.116 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.036) 0:03:11.153 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.033) 0:03:11.186 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 12 October 2024 20:53:16 -0400 (0:00:00.078) 0:03:11.265 ****** 
skipping: [managed-node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.046) 0:03:11.311 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.086) 0:03:11.397 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.033) 0:03:11.431 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.032) 0:03:11.463 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.034) 0:03:11.498 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.032) 0:03:11.530 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.033) 0:03:11.564 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.033) 0:03:11.597 ****** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.034) 0:03:11.632 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.115) 0:03:11.747 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.042) 0:03:11.789 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for 
managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.169) 0:03:11.959 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.040) 0:03:12.000 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.054) 0:03:12.055 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.036) 0:03:12.091 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.040) 0:03:12.132 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.033) 0:03:12.165 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 12 October 2024 
20:53:17 -0400 (0:00:00.032) 0:03:12.198 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.032) 0:03:12.231 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.033) 0:03:12.264 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 12 October 2024 20:53:17 -0400 (0:00:00.035) 0:03:12.299 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.032) 0:03:12.332 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.034) 0:03:12.366 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.057) 0:03:12.423 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.039) 0:03:12.462 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.041) 0:03:12.504 ****** skipping: [managed-node2] => { 
"changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.032) 0:03:12.537 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.038) 0:03:12.575 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.034) 0:03:12.610 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.046) 0:03:12.656 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.048) 0:03:12.705 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780788.4203124, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780788.4203124, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 333313, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1728780788.4203124, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.297) 0:03:13.002 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.046) 0:03:13.048 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } 
TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.045) 0:03:13.094 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.042) 0:03:13.136 ****** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.036) 0:03:13.173 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.032) 0:03:13.206 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 12 October 2024 20:53:18 -0400 (0:00:00.039) 0:03:13.245 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780788.532313, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780788.532313, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 332792, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1728780788.532313, "nlink": 1, "path": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 12 October 2024 20:53:19 -0400 (0:00:00.298) 0:03:13.544 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 12 October 2024 20:53:19 -0400 (0:00:00.586) 0:03:14.130 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.025841", "end": "2024-10-12 20:53:20.091553", "rc": 0, "start": "2024-10-12 20:53:20.065712" } STDOUT: LUKS header information for /dev/sda1 
Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 86 3b 7d 7d fe d1 39 77 01 57 f1 1f 33 ba 1b d0 f5 68 02 e8 MK salt: a6 41 06 0d 62 16 d2 32 2a b3 47 45 f1 94 a5 7c 9e 88 90 98 5e 36 66 3a 54 8c c5 3d be fe 5f 68 MK iterations: 23953 UUID: 69072daa-a86c-4059-9dc7-9ff32c5d8593 Key Slot 0: ENABLED Iterations: 383812 Salt: 56 b3 0f cd 34 8c 2c 62 58 67 bb f7 2d 89 14 51 c3 b7 d1 b3 dd 05 ad 93 d2 92 97 db c4 e5 b9 84 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.321) 0:03:14.452 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.043) 0:03:14.495 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.044) 0:03:14.540 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.045) 0:03:14.585 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.051) 0:03:14.636 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.053) 0:03:14.690 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.056) 0:03:14.746 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.044) 0:03:14.791 ****** ok: [managed-node2] => { "ansible_facts": 
{ "_storage_test_crypttab_entries": [ "luks-69072daa-a86c-4059-9dc7-9ff32c5d8593 /dev/sda1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.050) 0:03:14.841 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.058) 0:03:14.899 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.090) 0:03:14.989 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.050) 0:03:15.041 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.061) 0:03:15.102 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.063) 0:03:15.166 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.051) 0:03:15.217 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 12 October 2024 20:53:20 -0400 (0:00:00.052) 0:03:15.270 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.055) 0:03:15.325 ****** skipping: [managed-node2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.064) 0:03:15.389 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.049) 0:03:15.439 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.053) 0:03:15.493 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.055) 0:03:15.549 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.058) 0:03:15.607 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.044) 0:03:15.652 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.048) 0:03:15.700 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.047) 0:03:15.747 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.037) 0:03:15.785 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.035) 0:03:15.820 ****** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.036) 0:03:15.857 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.061) 0:03:15.918 ****** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.065) 0:03:15.984 ****** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.050) 0:03:16.034 ****** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.041) 0:03:16.076 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.042) 0:03:16.118 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.044) 0:03:16.163 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.049) 0:03:16.213 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.035) 0:03:16.248 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 12 October 2024 20:53:21 -0400 (0:00:00.035) 0:03:16.283 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.041) 0:03:16.325 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.049) 0:03:16.374 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.050) 0:03:16.425 ****** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.052) 0:03:16.477 ****** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.052) 0:03:16.530 ****** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.050) 0:03:16.580 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.049) 0:03:16.630 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.044) 0:03:16.674 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.041) 0:03:16.715 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] 
********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.046) 0:03:16.762 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.033) 0:03:16.796 ****** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.038) 0:03:16.834 ****** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.035) 0:03:16.870 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.044) 0:03:16.914 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.044) 0:03:16.959 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.045) 0:03:17.004 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.046) 0:03:17.050 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.041) 0:03:17.092 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.034) 0:03:17.126 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.034) 0:03:17.160 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.035) 0:03:17.196 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 12 October 2024 20:53:22 -0400 (0:00:00.064) 0:03:17.260 ****** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.046) 0:03:17.307 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.041) 0:03:17.348 ****** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:220 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.314) 0:03:17.663 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.113) 0:03:17.777 ****** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.063) 0:03:17.841 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 
20:53:23 -0400 (0:00:00.073) 0:03:17.914 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.067) 0:03:17.982 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.055) 0:03:18.037 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.087) 0:03:18.125 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.033) 0:03:18.158 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.032) 0:03:18.191 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.034) 0:03:18.226 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:53:23 -0400 (0:00:00.041) 0:03:18.267 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:53:24 -0400 (0:00:00.221) 0:03:18.488 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:53:25 -0400 (0:00:01.365) 0:03:19.854 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:53:25 -0400 (0:00:00.045) 0:03:19.899 ****** ok: [managed-node2] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:53:25 -0400 (0:00:00.051) 0:03:19.950 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:53:29 -0400 (0:00:03.979) 0:03:23.929 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:53:29 -0400 (0:00:00.156) 0:03:24.085 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:53:29 -0400 (0:00:00.052) 0:03:24.138 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:53:29 -0400 (0:00:00.090) 0:03:24.228 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:53:30 -0400 (0:00:00.088) 0:03:24.317 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:53:30 -0400 (0:00:00.693) 0:03:25.011 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": 
"getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", 
"status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": 
"rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": 
"systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service": { "name": "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:53:31 -0400 (0:00:00.970) 0:03:25.982 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:53:31 -0400 (0:00:00.100) 0:03:26.083 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2db040dea9\x2d044f\x2d4b42\x2da658\x2d67b331e77c81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service", "name": "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-journald.socket systemd-readahead-replay.service systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target dev-sda.device", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-b040dea9-044f-4b42-a658-67b331e77c81", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-b040dea9-044f-4b42-a658-67b331e77c81 /dev/sda - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-b040dea9-044f-4b42-a658-67b331e77c81 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": 
"/run/systemd/generator/systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:53:32 -0400 (0:00:00.589) 0:03:26.673 ****** fatal: [managed-node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-69072daa-a86c-4059-9dc7-9ff32c5d8593' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 12 October 2024 20:53:36 -0400 (0:00:03.882) 0:03:30.555 ****** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, 
u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-69072daa-a86c-4059-9dc7-9ff32c5d8593' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:53:36 -0400 (0:00:00.063) 0:03:30.618 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2db040dea9\x2d044f\x2d4b42\x2da658\x2d67b331e77c81.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service", "name": "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2db040dea9\\x2d044f\\x2d4b42\\x2da658\\x2d67b331e77c81.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", 
"PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 12 October 2024 20:53:36 -0400 (0:00:00.515) 0:03:31.134 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 12 October 2024 20:53:36 -0400 (0:00:00.074) 0:03:31.208 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 12 October 2024 20:53:36 -0400 (0:00:00.081) 0:03:31.289 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 12 October 2024 20:53:37 -0400 (0:00:00.053) 0:03:31.343 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780803.3053834, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1728780803.3053834, "dev": 64768, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1728780803.3053834, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "159885854", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 12 October 2024 20:53:37 -0400 (0:00:00.390) 0:03:31.734 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:244 Saturday 12 October 2024 20:53:37 -0400 (0:00:00.061) 0:03:31.797 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:53:37 -0400 (0:00:00.241) 0:03:32.039 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:53:37 -0400 (0:00:00.124) 0:03:32.163 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:53:37 -0400 (0:00:00.089) 0:03:32.252 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:53:38 -0400 (0:00:00.156) 0:03:32.408 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:53:38 -0400 (0:00:00.058) 0:03:32.467 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:53:38 -0400 (0:00:00.103) 0:03:32.571 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:53:38 -0400 (0:00:00.053) 0:03:32.624 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:53:38 -0400 (0:00:00.052) 0:03:32.676 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:53:38 -0400 (0:00:00.142) 0:03:32.819 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:53:39 -0400 (0:00:01.475) 0:03:34.295 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:53:40 -0400 (0:00:00.081) 0:03:34.376 ****** ok: [managed-node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:53:40 -0400 (0:00:00.105) 0:03:34.482 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK 
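
This second pass is where the encryption layer actually comes off. The value printed by "Show storage_pools" describes the same test1 partition as before, but with encryption: false, so converging on it means destroying the LUKS container and re-creating a plain xfs filesystem on /dev/sda1. Spelled out as playbook variables, and assuming the test disables the safe-mode guard for this step via the role's storage_safe_mode variable (the log shows safe_mode: True only in the earlier, deliberately failing invocation; the successful run below implies it was turned off here):

    - name: Remove the encryption layer
      include_role:
        name: fedora.linux_system_roles.storage
      vars:
        storage_safe_mode: false    # permit destructive re-formatting (inferred)
        storage_pools:
          - name: foo
            type: partition
            disks:
              - sda
            volumes:
              - name: test1
                type: partition
                size: 4g
                mount_point: /opt/test1
                encryption: false                    # drop the LUKS layer
                encryption_password: yabbadabbadoo   # opens the existing container
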
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:53:44 -0400 (0:00:03.916) 0:03:38.399 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:53:44 -0400 (0:00:00.092) 0:03:38.491 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:53:44 -0400 (0:00:00.055) 0:03:38.547 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:53:44 -0400 (0:00:00.052) 0:03:38.600 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:53:44 -0400 (0:00:00.049) 0:03:38.649 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:53:45 -0400 (0:00:00.676) 0:03:39.326 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", 
"source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service": { "name": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": 
"systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:53:45 -0400 (0:00:00.886) 0:03:40.212 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:53:45 -0400 (0:00:00.049) 0:03:40.262 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d69072daa\x2da86c\x2d4059\x2d9dc7\x2d9ff32c5d8593.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "name": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service dev-sda1.device systemd-journald.socket cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) 
man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-69072daa-a86c-4059-9dc7-9ff32c5d8593 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-69072daa-a86c-4059-9dc7-9ff32c5d8593 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.device", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": 
"50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:53:46 -0400 (0:00:00.478) 0:03:40.740 ****** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 12 October 2024 20:53:51 -0400 (0:00:04.711) 
0:03:45.452 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:53:51 -0400 (0:00:00.049) 0:03:45.501 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780790.814324, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "f2e0da1b2462e59ef142173e5735dbd16daab710", "ctime": 1728780790.811324, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1728780790.811324, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:53:51 -0400 (0:00:00.365) 0:03:45.867 ****** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:53:51 -0400 (0:00:00.324) 0:03:46.191 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d69072daa\x2da86c\x2d4059\x2d9dc7\x2d9ff32c5d8593.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "name": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", 
"IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:53:52 -0400 (0:00:00.526) 0:03:46.718 ****** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": 
"/opt/test1", "src": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:53:52 -0400 (0:00:00.072) 0:03:46.790 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", 
"mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:53:52 -0400 (0:00:00.065) 0:03:46.856 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:53:52 -0400 (0:00:00.056) 0:03:46.913 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-69072daa-a86c-4059-9dc7-9ff32c5d8593" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 October 2024 20:53:53 -0400 (0:00:00.391) 0:03:47.305 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 20:53:53 -0400 (0:00:00.454) 0:03:47.760 ****** changed: [managed-node2] => (item={u'src': u'UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 20:53:53 -0400 (0:00:00.369) 0:03:48.129 ****** skipping: [managed-node2] => (item={u'src': u'UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": 
"mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 20:53:53 -0400 (0:00:00.139) 0:03:48.269 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 20:53:54 -0400 (0:00:00.476) 0:03:48.745 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780794.3653407, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "062e1208786cec713975cdbbe2e0ded8cbcc3df7", "ctime": 1728780792.3983314, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917510, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1728780792.3983314, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 54, "uid": 0, "version": "1741554342", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 20:53:54 -0400 (0:00:00.345) 0:03:49.091 ****** changed: [managed-node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-69072daa-a86c-4059-9dc7-9ff32c5d8593', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 20:53:55 -0400 (0:00:00.381) 0:03:49.472 ****** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:261 Saturday 12 October 2024 20:53:55 -0400 (0:00:00.720) 0:03:50.193 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 12 October 2024 20:53:56 -0400 (0:00:00.151) 0:03:50.345 ****** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": 
false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 12 October 2024 20:53:56 -0400 (0:00:00.068) 0:03:50.413 ****** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 12 October 2024 20:53:56 -0400 (0:00:00.059) 0:03:50.472 ****** ok: [managed-node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "d07ed621-1c1a-4c14-a806-9b6c8aba7045" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 12 October 2024 20:53:56 -0400 (0:00:00.360) 0:03:50.833 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002900", "end": "2024-10-12 20:53:56.809221", "rc": 0, "start": "2024-10-12 20:53:56.806321" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 12 October 2024 20:53:56 -0400 (0:00:00.355) 0:03:51.189 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002759", "end": "2024-10-12 20:53:57.169291", "failed_when_result": false, "rc": 0, "start": "2024-10-12 20:53:57.166532" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.363) 0:03:51.553 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.118) 0:03:51.672 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.052) 0:03:51.724 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.051) 0:03:51.776 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.054) 0:03:51.830 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.120) 0:03:51.950 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.052) 0:03:52.003 ****** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.045) 0:03:52.048 
****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.054) 0:03:52.102 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.054) 0:03:52.157 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.052) 0:03:52.210 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 12 October 2024 20:53:57 -0400 (0:00:00.056) 0:03:52.266 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.051) 0:03:52.317 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.049) 0:03:52.366 ****** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.053) 0:03:52.419 ****** ok: [managed-node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.45.43 closed. 
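Note: the pool and volume state being verified above (per the logged blivet_output and _storage_pools_list: partition pool "foo" on sda, volume "test1", 4g, xfs, mounted at /opt/test1, encryption now disabled) corresponds roughly to a storage role invocation like the sketch below. This is reconstructed from the logged values only, not taken from the actual tests_luks.yml source; the play name, safe-mode setting, and layout are assumptions.

    # Reconstructed sketch inferred from the log output above; not the original test playbook.
    - name: Remove LUKS from the test volume (reconstructed sketch)
      hosts: managed-node2
      vars:
        # assumption: safe mode must be disabled so the role may destroy the existing LUKS format
        storage_safe_mode: false
      tasks:
        - name: Re-run the storage role with encryption disabled
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_pools:
              - name: foo
                type: partition
                disks: [sda]
                volumes:
                  - name: test1
                    type: partition
                    size: 4g
                    fs_type: xfs
                    mount_point: /opt/test1
                    # previously true; flipping to false produces the
                    # "destroy format ... luks" / "create format ... xfs" actions logged above
                    encryption: false

A run with these values would be expected to remove the luks-69072daa-* mapping, drop its /etc/crypttab entry, and remount /opt/test1 from the plain /dev/sda1 UUID, which is the behavior the verification tasks that follow are checking.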
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.235) 0:03:52.654 ****** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.031) 0:03:52.686 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.072) 0:03:52.759 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.035) 0:03:52.794 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.033) 0:03:52.828 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.032) 0:03:52.860 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.033) 0:03:52.893 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.035) 0:03:52.929 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.076) 0:03:53.005 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.034) 0:03:53.040 ****** skipping: [managed-node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.033) 0:03:53.073 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.034) 0:03:53.107 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.034) 0:03:53.142 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.034) 0:03:53.177 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 12 October 2024 20:53:58 -0400 (0:00:00.073) 0:03:53.250 ****** skipping: [managed-node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], 
"cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.067) 0:03:53.318 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.081) 0:03:53.399 ****** skipping: [managed-node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.049) 0:03:53.448 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.084) 0:03:53.532 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.044) 0:03:53.577 ****** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.033) 0:03:53.611 ****** TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.031) 0:03:53.642 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.033) 0:03:53.676 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.078) 0:03:53.755 ****** skipping: [managed-node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/sda1', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'_kernel_device': u'/dev/sda1', u'encryption': False, u'raid_level': None, 
u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': 0, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/sda1", "_kernel_device": "/dev/sda1", "_mount_id": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.050) 0:03:53.805 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.086) 0:03:53.892 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.034) 0:03:53.926 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.032) 0:03:53.959 ****** skipping: [managed-node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.036) 0:03:53.995 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.033) 0:03:54.029 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.032) 0:03:54.061 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.032) 0:03:54.094 ****** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.033) 0:03:54.128 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.067) 0:03:54.196 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 12 October 2024 20:53:59 -0400 (0:00:00.044) 0:03:54.240 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.171) 0:03:54.411 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.040) 0:03:54.452 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.043) 0:03:54.496 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.036) 0:03:54.532 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.052) 0:03:54.585 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.034) 0:03:54.619 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.034) 0:03:54.653 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.033) 0:03:54.687 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.032) 0:03:54.720 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.035) 0:03:54.755 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.034) 0:03:54.789 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.034) 0:03:54.824 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.058) 0:03:54.883 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.042) 0:03:54.926 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.043) 0:03:54.969 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.034) 0:03:55.004 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 
12 October 2024 20:54:00 -0400 (0:00:00.039) 0:03:55.043 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.035) 0:03:55.079 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.046) 0:03:55.125 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 12 October 2024 20:54:00 -0400 (0:00:00.049) 0:03:55.175 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780831.0365157, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780831.0365157, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 344572, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1728780831.0365157, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 12 October 2024 20:54:01 -0400 (0:00:00.310) 0:03:55.486 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 12 October 2024 20:54:01 -0400 (0:00:00.046) 0:03:55.532 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 12 October 2024 20:54:01 -0400 (0:00:00.035) 0:03:55.568 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 12 October 2024 20:54:01 -0400 (0:00:00.053) 0:03:55.622 ****** 
ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 12 October 2024 20:54:01 -0400 (0:00:00.041) 0:03:55.663 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 12 October 2024 20:54:01 -0400 (0:00:00.034) 0:03:55.697 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 12 October 2024 20:54:01 -0400 (0:00:00.040) 0:03:55.737 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 12 October 2024 20:54:01 -0400 (0:00:00.033) 0:03:55.771 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.574) 0:03:56.345 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.046) 0:03:56.391 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.042) 0:03:56.434 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.046) 0:03:56.480 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.034) 0:03:56.515 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] 
****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.035) 0:03:56.551 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.044) 0:03:56.595 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.059) 0:03:56.654 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.052) 0:03:56.707 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.053) 0:03:56.760 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.054) 0:03:56.815 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.038) 0:03:56.853 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.034) 0:03:56.887 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.036) 0:03:56.924 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": 
null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.040) 0:03:56.964 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.050) 0:03:57.015 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.053) 0:03:57.069 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.055) 0:03:57.124 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.054) 0:03:57.179 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.058) 0:03:57.237 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 12 October 2024 20:54:02 -0400 (0:00:00.054) 0:03:57.292 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.050) 0:03:57.343 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.042) 0:03:57.385 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.044) 0:03:57.430 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.048) 0:03:57.479 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.042) 0:03:57.522 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.037) 0:03:57.559 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.037) 0:03:57.597 ****** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.038) 0:03:57.635 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.038) 0:03:57.674 ****** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.043) 0:03:57.718 ****** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.070) 0:03:57.788 ****** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.054) 0:03:57.843 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] 
********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.045) 0:03:57.888 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.043) 0:03:57.931 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.044) 0:03:57.976 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.035) 0:03:58.012 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.037) 0:03:58.050 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.035) 0:03:58.086 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.050) 0:03:58.136 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.051) 0:03:58.188 ****** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.052) 0:03:58.241 ****** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 12 October 2024 20:54:03 -0400 (0:00:00.047) 0:03:58.289 ****** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool 
size] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.049) 0:03:58.338 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.046) 0:03:58.385 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.043) 0:03:58.428 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.039) 0:03:58.468 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.034) 0:03:58.502 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.034) 0:03:58.537 ****** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.041) 0:03:58.578 ****** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.082) 0:03:58.661 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.044) 0:03:58.705 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.054) 0:03:58.760 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.053) 0:03:58.814 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.043) 0:03:58.857 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.042) 0:03:58.900 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.043) 0:03:58.943 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.034) 0:03:58.978 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.034) 0:03:59.012 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.040) 0:03:59.053 ****** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.049) 0:03:59.102 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 12 October 2024 20:54:04 -0400 (0:00:00.052) 0:03:59.155 ****** changed: [managed-node2] => { "changed": true, 
"dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:267 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.324) 0:03:59.479 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.096) 0:03:59.576 ****** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.040) 0:03:59.616 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.049) 0:03:59.665 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.065) 0:03:59.730 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.061) 0:03:59.792 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.146) 0:03:59.938 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.054) 0:03:59.993 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.056) 0:04:00.049 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.054) 0:04:00.103 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.043) 0:04:00.147 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:54:05 -0400 (0:00:00.105) 0:04:00.252 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:54:07 -0400 (0:00:01.365) 0:04:01.618 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] 
**************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:54:07 -0400 (0:00:00.105) 0:04:01.723 ****** ok: [managed-node2] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:54:07 -0400 (0:00:00.067) 0:04:01.791 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:54:11 -0400 (0:00:03.860) 0:04:05.652 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:54:11 -0400 (0:00:00.097) 0:04:05.749 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:54:11 -0400 (0:00:00.047) 0:04:05.797 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:54:11 -0400 (0:00:00.051) 0:04:05.848 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:54:11 -0400 (0:00:00.051) 0:04:05.899 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:54:12 -0400 (0:00:00.720) 0:04:06.620 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": 
"dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": 
"lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": 
"rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service": { "name": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": 
"systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:54:13 -0400 (0:00:00.945) 0:04:07.565 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:54:13 -0400 (0:00:00.057) 0:04:07.623 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d69072daa\x2da86c\x2d4059\x2d9dc7\x2d9ff32c5d8593.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "name": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-readahead-collect.service systemd-readahead-replay.service dev-sda1.device systemd-journald.socket", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", 
"AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-69072daa-a86c-4059-9dc7-9ff32c5d8593", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-69072daa-a86c-4059-9dc7-9ff32c5d8593 /dev/sda1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-69072daa-a86c-4059-9dc7-9ff32c5d8593 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": 
"yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:54:13 -0400 (0:00:00.459) 0:04:08.083 ****** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 12 October 2024 20:54:17 -0400 (0:00:03.716) 0:04:11.799 ****** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'partition', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'sda1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:54:17 -0400 (0:00:00.073) 0:04:11.872 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d69072daa\x2da86c\x2d4059\x2d9dc7\x2d9ff32c5d8593.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "name": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d69072daa\\x2da86c\\x2d4059\\x2d9dc7\\x2d9ff32c5d8593.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", 
"StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 12 October 2024 20:54:18 -0400 (0:00:00.549) 0:04:12.421 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 12 October 2024 20:54:18 -0400 (0:00:00.053) 0:04:12.475 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 12 October 2024 20:54:18 -0400 (0:00:00.075) 0:04:12.550 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 12 October 2024 20:54:18 -0400 (0:00:00.047) 0:04:12.598 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780845.1205828, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1728780845.1205828, "dev": 2049, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1728780845.1205828, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744072012900473", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 12 October 2024 20:54:18 -0400 (0:00:00.329) 0:04:12.927 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Create a key file] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:293 Saturday 12 October 2024 20:54:18 -0400 (0:00:00.049) 0:04:12.977 ****** ok: [managed-node2] => { "changed": false, "gid": 0, "group": "root", "mode": "0600", 
"owner": "root", "path": "/tmp/storage_testxiCLrelukskey", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Write the key into the key file] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:300 Saturday 12 October 2024 20:54:19 -0400 (0:00:00.458) 0:04:13.436 ****** ok: [managed-node2] => { "changed": false, "checksum": "7a4dff3752e2baf5617c57eaac048e2b95e8af91", "dest": "/tmp/storage_testxiCLrelukskey", "gid": 0, "group": "root", "md5sum": "4ac07b967150835c00d0865161e48744", "mode": "0600", "owner": "root", "secontext": "unconfined_u:object_r:user_tmp_t:s0", "size": 32, "src": "/root/.ansible/tmp/ansible-tmp-1728780859.18-17944-45428675303946/source", "state": "file", "uid": 0 } TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:307 Saturday 12 October 2024 20:54:19 -0400 (0:00:00.784) 0:04:14.221 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:54:19 -0400 (0:00:00.059) 0:04:14.280 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:54:20 -0400 (0:00:00.066) 0:04:14.347 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:54:20 -0400 (0:00:00.056) 0:04:14.404 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:54:20 -0400 (0:00:00.102) 0:04:14.506 ****** skipping: 
[managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:54:20 -0400 (0:00:00.040) 0:04:14.547 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:54:20 -0400 (0:00:00.041) 0:04:14.588 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:54:20 -0400 (0:00:00.032) 0:04:14.621 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:54:20 -0400 (0:00:00.032) 0:04:14.653 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:54:20 -0400 (0:00:00.097) 0:04:14.751 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:54:21 -0400 (0:00:01.391) 0:04:16.142 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "partition", "volumes": [ { "encryption": true, "encryption_key": "/tmp/storage_testxiCLrelukskey", "mount_point": "/opt/test1", "name": "test1", "size": "4g", "type": "partition" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:54:21 -0400 
(0:00:00.077) 0:04:16.220 ****** ok: [managed-node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:54:21 -0400 (0:00:00.050) 0:04:16.271 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:54:26 -0400 (0:00:04.098) 0:04:20.370 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:54:26 -0400 (0:00:00.095) 0:04:20.466 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:54:26 -0400 (0:00:00.055) 0:04:20.521 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:54:26 -0400 (0:00:00.050) 0:04:20.572 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:54:26 -0400 (0:00:00.046) 0:04:20.618 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:54:27 -0400 (0:00:00.683) 0:04:21.302 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" 
}, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", 
"source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": 
"systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": 
"plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": 
"unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": 
"systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:54:28 -0400 (0:00:01.010) 0:04:22.312 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:54:28 -0400 (0:00:00.073) 0:04:22.385 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:54:28 -0400 (0:00:00.043) 0:04:22.429 ****** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": 
null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 12 October 2024 20:54:38 -0400 (0:00:10.562) 0:04:32.992 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:54:38 -0400 (0:00:00.065) 0:04:33.057 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780833.7345285, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "c39be91dab65eae87857d0b83fd706ec36e0a39f", "ctime": 1728780833.7315285, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1728780833.7315285, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1299, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to 
/etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:54:39 -0400 (0:00:00.366) 0:04:33.424 ****** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:54:39 -0400 (0:00:00.360) 0:04:33.785 ****** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:54:39 -0400 (0:00:00.044) 0:04:33.829 ****** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/sda1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, 
"raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:54:39 -0400 (0:00:00.077) 0:04:33.907 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:54:39 -0400 (0:00:00.063) 0:04:33.971 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:54:39 -0400 (0:00:00.056) 0:04:34.027 ****** changed: [managed-node2] => (item={u'src': u'UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "UUID=d07ed621-1c1a-4c14-a806-9b6c8aba7045" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view 
of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 October 2024 20:54:40 -0400 (0:00:00.396) 0:04:34.424 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 20:54:40 -0400 (0:00:00.483) 0:04:34.907 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 20:54:40 -0400 (0:00:00.326) 0:04:35.234 ****** skipping: [managed-node2] => (item={u'src': u'/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 20:54:40 -0400 (0:00:00.043) 0:04:35.277 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 20:54:41 -0400 (0:00:00.428) 0:04:35.705 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780837.168545, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1728780835.085535, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917511, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1728780835.085535, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1741554527", "wgrp": 
false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 20:54:41 -0400 (0:00:00.301) 0:04:36.007 ****** changed: [managed-node2] => (item={u'state': u'present', u'password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'name': u'luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 20:54:42 -0400 (0:00:00.324) 0:04:36.332 ****** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:324 Saturday 12 October 2024 20:54:42 -0400 (0:00:00.628) 0:04:36.960 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 12 October 2024 20:54:42 -0400 (0:00:00.057) 0:04:37.018 ****** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "partition", "volumes": [ { "_device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": 
null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 12 October 2024 20:54:42 -0400 (0:00:00.047) 0:04:37.065 ****** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 12 October 2024 20:54:42 -0400 (0:00:00.034) 0:04:37.099 ****** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "size": "10G", "type": "crypt", "uuid": "620d5afd-d9a4-4ada-a87d-8f3a56def5bb" }, "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sda1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/sda1", "size": "10G", "type": "partition", "uuid": "eb17a9a2-7591-4eb0-9785-67e5637fbe1b" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 12 October 2024 20:54:43 -0400 (0:00:00.311) 0:04:37.411 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002818", "end": "2024-10-12 20:54:43.344655", "rc": 0, "start": "2024-10-12 20:54:43.341837" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs 
defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 12 October 2024 20:54:43 -0400 (0:00:00.292) 0:04:37.704 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002528", "end": "2024-10-12 20:54:43.636805", "failed_when_result": false, "rc": 0, "start": "2024-10-12 20:54:43.634277" } STDOUT: luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 12 October 2024 20:54:43 -0400 (0:00:00.287) 0:04:37.991 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 12 October 2024 20:54:43 -0400 (0:00:00.077) 0:04:38.068 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 12 October 2024 20:54:43 -0400 (0:00:00.033) 0:04:38.102 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 12 October 2024 20:54:43 -0400 (0:00:00.033) 0:04:38.135 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 12 October 2024 20:54:43 -0400 (0:00:00.039) 0:04:38.175 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.137) 0:04:38.312 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.052) 0:04:38.365 ****** TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.047) 0:04:38.412 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.045) 0:04:38.458 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.042) 0:04:38.501 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.044) 0:04:38.546 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.039) 0:04:38.585 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.039) 0:04:38.625 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.034) 0:04:38.659 ****** TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.032) 0:04:38.692 ****** ok: [managed-node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.45.43 closed. 
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.246) 0:04:38.938 ****** TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.031) 0:04:38.970 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.071) 0:04:39.041 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.038) 0:04:39.080 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.049) 0:04:39.129 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.050) 0:04:39.179 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.054) 0:04:39.234 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 12 October 2024 20:54:44 -0400 (0:00:00.048) 0:04:39.283 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.056) 0:04:39.339 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.052) 0:04:39.392 ****** skipping: [managed-node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.050) 0:04:39.443 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.042) 0:04:39.486 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.044) 0:04:39.531 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.046) 0:04:39.577 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.078) 0:04:39.656 ****** skipping: [managed-node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_lvmraid_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_lvmraid_volume": { "_device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_kernel_device": "/dev/dm-0", "_mount_id": 
"/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.047) 0:04:39.703 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.075) 0:04:39.779 ****** skipping: [managed-node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'_device': u'/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_thin_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_thin_volume": { "_device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" 
], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check member encryption] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.054) 0:04:39.834 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.110) 0:04:39.944 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.050) 0:04:39.995 ****** TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.042) 0:04:40.037 ****** TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.039) 0:04:40.077 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.036) 0:04:40.114 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.087) 0:04:40.201 ****** skipping: [managed-node2] => (item={u'_raw_device': u'/dev/sda1', u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', 
u'_device': u'/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'_kernel_device': u'/dev/dm-0', u'encryption': True, u'raid_level': None, u'name': u'test1', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'partition', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'_mount_id': u'/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b', u'mount_user': None, u'raid_spare_count': None, u'raid_disks': [], u'_raw_kernel_device': u'/dev/sda1', u'cache_mode': None, u'cache_devices': [], u'deduplication': None, u'mount_group': None, u'thin_pool_size': None, u'disks': [u'sda'], u'cached': False, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'fs_overwrite_existing': True, u'fs_create_options': u''}) => { "ansible_loop_var": "storage_test_vdo_volume", "changed": false, "skip_reason": "Conditional result was False", "storage_test_vdo_volume": { "_device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "_raw_device": "/dev/sda1", "_raw_kernel_device": "/dev/sda1", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "partition", "vdo_pool_size": null } } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 12 October 2024 20:54:45 -0400 (0:00:00.070) 0:04:40.271 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.147) 0:04:40.419 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.046) 0:04:40.465 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.045) 0:04:40.511 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.037) 0:04:40.549 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.032) 0:04:40.582 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.032) 0:04:40.614 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.032) 0:04:40.646 ****** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.036) 0:04:40.683 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.065) 0:04:40.748 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.049) 0:04:40.798 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.256) 0:04:41.054 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.040) 0:04:41.095 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.049) 0:04:41.144 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.050) 0:04:41.194 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 12 October 2024 20:54:46 -0400 (0:00:00.063) 0:04:41.257 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.044) 0:04:41.302 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.049) 0:04:41.352 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of 
test volume device] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.052) 0:04:41.405 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.045) 0:04:41.451 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.037) 0:04:41.488 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.034) 0:04:41.523 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.035) 0:04:41.558 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.063) 0:04:41.622 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.044) 0:04:41.666 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.041) 0:04:41.708 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.034) 0:04:41.742 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.040) 0:04:41.783 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.036) 0:04:41.819 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.051) 0:04:41.871 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 12 October 2024 20:54:47 -0400 (0:00:00.069) 0:04:41.940 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780878.4327416, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780878.4327416, "dev": 5, "device_type": 2049, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 354088, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1728780878.4327416, "nlink": 1, "path": "/dev/sda1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 12 October 2024 20:54:48 -0400 (0:00:00.363) 0:04:42.304 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 12 October 2024 20:54:48 -0400 (0:00:00.051) 0:04:42.355 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 12 October 2024 20:54:48 -0400 (0:00:00.035) 0:04:42.390 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 12 October 2024 20:54:48 -0400 (0:00:00.052) 0:04:42.442 ****** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "partition" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 12 October 2024 20:54:48 -0400 (0:00:00.052) 0:04:42.495 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 12 October 2024 20:54:48 -0400 (0:00:00.048) 0:04:42.543 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 12 October 2024 20:54:48 -0400 (0:00:00.050) 0:04:42.594 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780878.5467422, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780878.5467422, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 354129, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1728780878.5467422, "nlink": 1, "path": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 12 October 2024 20:54:48 -0400 (0:00:00.328) 0:04:42.922 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 12 October 2024 20:54:49 -0400 (0:00:00.616) 0:04:43.538 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/sda1" ], "delta": "0:00:00.025783", "end": "2024-10-12 20:54:49.537708", "rc": 0, "start": "2024-10-12 20:54:49.511925" } STDOUT: LUKS header information for /dev/sda1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 
8192 MK bits: 512 MK digest: 6a 00 92 3f fa 03 03 ca 31 31 a3 e6 61 c9 a4 81 03 10 8a 71 MK salt: 19 1d b1 ef d1 38 2e 8b f6 06 7b f6 53 57 af 92 1a 65 a8 2a 59 bc 68 ec e7 a4 48 63 38 0c bf a5 MK iterations: 23848 UUID: eb17a9a2-7591-4eb0-9785-67e5637fbe1b Key Slot 0: ENABLED Iterations: 382690 Salt: 14 d3 fc 6f ae 9c a4 2b 7e 24 0f 7a 63 3a a9 a0 70 93 23 6c 0b 79 21 71 02 06 9d 24 fd 0e c5 32 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 12 October 2024 20:54:49 -0400 (0:00:00.377) 0:04:43.916 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 12 October 2024 20:54:49 -0400 (0:00:00.064) 0:04:43.980 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 12 October 2024 20:54:49 -0400 (0:00:00.066) 0:04:44.047 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 12 October 2024 20:54:49 -0400 (0:00:00.065) 0:04:44.113 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 12 October 2024 20:54:49 -0400 (0:00:00.063) 0:04:44.176 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 12 October 2024 20:54:49 -0400 (0:00:00.057) 0:04:44.234 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 12 October 2024 20:54:49 -0400 (0:00:00.055) 0:04:44.290 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.058) 0:04:44.348 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b 
/dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.064) 0:04:44.413 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.060) 0:04:44.474 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.075) 0:04:44.549 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.062) 0:04:44.611 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.056) 0:04:44.668 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.039) 0:04:44.707 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.033) 0:04:44.740 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.033) 0:04:44.774 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.041) 0:04:44.815 ****** skipping: [managed-node2] => { "changed": false, 
"skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.052) 0:04:44.867 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.127) 0:04:44.994 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.057) 0:04:45.051 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.056) 0:04:45.107 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.056) 0:04:45.164 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.055) 0:04:45.220 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 12 October 2024 20:54:50 -0400 (0:00:00.055) 0:04:45.275 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.064) 0:04:45.339 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.062) 0:04:45.402 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.057) 0:04:45.459 ****** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.043) 0:04:45.502 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.052) 0:04:45.555 ****** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.056) 0:04:45.611 ****** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.047) 0:04:45.659 ****** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.038) 0:04:45.697 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.037) 0:04:45.734 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.033) 0:04:45.768 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.033) 0:04:45.801 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.033) 0:04:45.835 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] 
******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.035) 0:04:45.871 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.033) 0:04:45.905 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.039) 0:04:45.944 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.050) 0:04:45.995 ****** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.049) 0:04:46.044 ****** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.049) 0:04:46.094 ****** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.057) 0:04:46.151 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.055) 0:04:46.207 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 12 October 2024 20:54:51 -0400 (0:00:00.054) 0:04:46.262 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.052) 0:04:46.314 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] 
********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.053) 0:04:46.368 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.053) 0:04:46.421 ****** ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.061) 0:04:46.483 ****** ok: [managed-node2] => { "storage_test_expected_size": "VARIABLE IS NOT DEFINED!: 'storage_test_expected_size' is undefined" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.057) 0:04:46.541 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.056) 0:04:46.597 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.057) 0:04:46.655 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.059) 0:04:46.714 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.052) 0:04:46.767 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.055) 0:04:46.822 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.051) 0:04:46.874 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.051) 0:04:46.925 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.051) 0:04:46.976 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.054) 0:04:47.031 ****** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.046) 0:04:47.077 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Remove the key file] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:327 Saturday 12 October 2024 20:54:52 -0400 (0:00:00.056) 0:04:47.133 ****** ok: [managed-node2] => { "changed": false, "path": "/tmp/storage_testxiCLrelukskey", "state": "absent" } TASK [Test for correct handling of new encrypted volume w/ no key] ************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:337 Saturday 12 October 2024 20:54:53 -0400 (0:00:00.354) 0:04:47.488 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 12 October 2024 20:54:53 -0400 (0:00:00.083) 0:04:47.571 ****** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 12 October 2024 20:54:53 -0400 (0:00:00.063) 0:04:47.634 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:54:53 -0400 (0:00:00.095) 0:04:47.730 ****** included: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:54:53 -0400 (0:00:00.089) 0:04:47.819 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:54:53 -0400 (0:00:00.064) 0:04:47.884 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:54:53 -0400 (0:00:00.124) 0:04:48.009 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:54:53 -0400 (0:00:00.050) 0:04:48.060 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:54:53 -0400 (0:00:00.049) 0:04:48.110 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:54:53 -0400 (0:00:00.080) 0:04:48.191 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:54:53 -0400 (0:00:00.055) 0:04:48.247 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:54:54 -0400 (0:00:00.135) 0:04:48.382 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:54:55 -0400 (0:00:01.386) 0:04:49.769 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:54:55 -0400 (0:00:00.085) 0:04:49.854 ****** ok: [managed-node2] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:54:55 -0400 (0:00:00.194) 0:04:50.049 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:54:59 -0400 (0:00:04.086) 0:04:54.135 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:54:59 -0400 (0:00:00.090) 0:04:54.226 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:54:59 -0400 (0:00:00.063) 0:04:54.290 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:55:00 -0400 (0:00:00.063) 0:04:54.353 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:55:00 -0400 (0:00:00.054) 0:04:54.407 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:55:00 -0400 (0:00:00.712) 0:04:55.119 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": 
"stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", 
"status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": 
"network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { 
"name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", 
"status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { 
"name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:55:01 -0400 (0:00:01.079) 0:04:56.199 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:55:02 -0400 (0:00:00.114) 0:04:56.314 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:55:02 -0400 (0:00:00.064) 0:04:56.378 ****** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: encrypted volume 'test1' missing key/password TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 12 October 2024 20:55:06 -0400 (0:00:04.144) 0:05:00.523 ****** fatal: [managed-node2]: FAILED! 
=> { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': False, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': None, u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"encrypted volume 'test1' missing key/password"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.052) 0:05:00.575 ****** 
TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.032) 0:05:00.608 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.041) 0:05:00.649 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.050) 0:05:00.699 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Create an encrypted lvm volume w/ default fs] **************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:355 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.033) 0:05:00.733 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.067) 0:05:00.800 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.054) 0:05:00.855 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.042) 0:05:00.897 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] 
*********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.082) 0:05:00.980 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.034) 0:05:01.014 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.035) 0:05:01.050 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.034) 0:05:01.085 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.034) 0:05:01.119 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:55:06 -0400 (0:00:00.078) 0:05:01.198 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:55:08 -0400 (0:00:01.225) 0:05:02.423 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": 
"test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:55:08 -0400 (0:00:00.048) 0:05:02.472 ****** ok: [managed-node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:55:08 -0400 (0:00:00.036) 0:05:02.508 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:55:12 -0400 (0:00:04.041) 0:05:06.549 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:55:12 -0400 (0:00:00.061) 0:05:06.610 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:55:12 -0400 (0:00:00.030) 0:05:06.641 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:55:12 -0400 (0:00:00.034) 0:05:06.675 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:55:12 -0400 (0:00:00.078) 0:05:06.754 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:55:13 -0400 (0:00:00.577) 0:05:07.332 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, 
"arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": 
"systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": 
"systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", 
"source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": 
"systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": 
"systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:55:13 -0400 (0:00:00.862) 0:05:08.194 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:55:13 -0400 (0:00:00.055) 0:05:08.249 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:55:13 -0400 (0:00:00.031) 0:05:08.280 ****** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create 
device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-4780f917-80bf-4089-913a-4e99164f31be", "password": "-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 12 October 2024 20:55:24 -0400 (0:00:10.822) 0:05:19.103 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:55:24 -0400 (0:00:00.053) 0:05:19.157 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780880.8597534, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "74026b9634e28f96bb8df2bde6dce70a2acfa994", "ctime": 1728780880.8567533, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1728780880.8567533, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:55:25 -0400 (0:00:00.542) 0:05:19.700 ****** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:55:25 -0400 (0:00:00.472) 0:05:20.172 ****** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:55:25 -0400 (0:00:00.057) 0:05:20.230 ****** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/sda1", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "disklabel" }, { "action": "create format", "device": "/dev/sda", "fs_type": "lvmpv" }, { "action": "create device", "device": "/dev/foo", "fs_type": null }, { "action": "create device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/sda1", "name": "luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "password": "-", "state": "absent" }, { "backing_device": "/dev/mapper/foo-test1", "name": "luks-4780f917-80bf-4089-913a-4e99164f31be", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": 
"/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:55:26 -0400 (0:00:00.143) 0:05:20.374 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", 
"fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:55:26 -0400 (0:00:00.071) 0:05:20.445 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:55:26 -0400 (0:00:00.078) 0:05:20.524 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 October 2024 20:55:26 -0400 (0:00:00.408) 0:05:20.932 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 20:55:27 -0400 (0:00:00.662) 0:05:21.595 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 20:55:27 -0400 (0:00:00.637) 0:05:22.233 ****** skipping: [managed-node2] => (item={u'src': u'/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be', 
u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 20:55:28 -0400 (0:00:00.112) 0:05:22.345 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 20:55:28 -0400 (0:00:00.771) 0:05:23.117 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780883.6357665, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "1e307b4814b48cd4c24fdc78956a8bfc9ada8a20", "ctime": 1728780881.9617586, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917512, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1728780881.9607584, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 88, "uid": 0, "version": "1741554707", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 20:55:29 -0400 (0:00:00.400) 0:05:23.517 ****** changed: [managed-node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b', u'backing_device': u'/dev/sda1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/sda1", "name": "luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed changed: [managed-node2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-4780f917-80bf-4089-913a-4e99164f31be', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-4780f917-80bf-4089-913a-4e99164f31be", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 20:55:29 -0400 (0:00:00.595) 0:05:24.113 ****** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:374 Saturday 12 October 2024 20:55:30 -0400 (0:00:00.703) 0:05:24.816 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 12 October 2024 20:55:30 -0400 (0:00:00.077) 0:05:24.893 ****** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [], "encryption": true, "encryption_cipher": "aes-xts-plain64", "encryption_key": null, "encryption_key_size": 512, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 12 October 2024 20:55:30 -0400 (0:00:00.072) 0:05:24.966 ****** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 12 October 2024 20:55:30 -0400 (0:00:00.052) 0:05:25.018 ****** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "4780f917-80bf-4089-913a-4e99164f31be" }, "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "size": "4G", "type": "crypt", "uuid": "7eb1e6a9-0886-471f-968a-605a5b64e85c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "nXc7RT-HEZY-OYjB-gzgg-aPxl-xWkJ-gSSNJT" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 12 October 2024 20:55:31 -0400 (0:00:00.360) 0:05:25.379 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002608", "end": "2024-10-12 20:55:31.307538", "rc": 0, "start": "2024-10-12 20:55:31.304930" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 12 October 2024 20:55:31 -0400 (0:00:00.286) 0:05:25.665 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002665", "end": "2024-10-12 20:55:31.617724", "failed_when_result": false, "rc": 0, "start": "2024-10-12 20:55:31.615059" } STDOUT: luks-4780f917-80bf-4089-913a-4e99164f31be /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 12 October 2024 20:55:31 -0400 (0:00:00.311) 0:05:25.976 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 12 October 2024 20:55:31 -0400 (0:00:00.122) 0:05:26.099 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 12 October 2024 20:55:31 -0400 (0:00:00.035) 0:05:26.134 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.017999", "end": "2024-10-12 20:55:32.088638", "rc": 0, "start": "2024-10-12 20:55:32.070639" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 12 October 2024 20:55:32 -0400 (0:00:00.312) 0:05:26.447 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 12 October 2024 20:55:32 -0400 (0:00:00.046) 0:05:26.494 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 12 October 2024 20:55:32 -0400 (0:00:00.077) 0:05:26.572 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", 
"_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 12 October 2024 20:55:32 -0400 (0:00:00.048) 0:05:26.620 ****** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 12 October 2024 20:55:32 -0400 (0:00:00.477) 0:05:27.098 ****** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 12 October 2024 20:55:32 -0400 (0:00:00.040) 0:05:27.138 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 12 October 2024 20:55:32 -0400 (0:00:00.042) 0:05:27.181 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 12 October 2024 20:55:32 -0400 (0:00:00.044) 0:05:27.225 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 12 October 2024 20:55:32 -0400 (0:00:00.040) 0:05:27.266 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.058) 0:05:27.324 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.046) 0:05:27.371 ****** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.076) 0:05:27.447 ****** ok: [managed-node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.45.43 closed. 
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.229) 0:05:27.676 ****** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.045) 0:05:27.722 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.110) 0:05:27.833 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.056) 0:05:27.890 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.051) 0:05:27.941 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.056) 0:05:27.997 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.054) 0:05:28.052 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.052) 0:05:28.104 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.052) 0:05:28.157 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.055) 0:05:28.212 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 12 October 2024 20:55:33 -0400 (0:00:00.053) 0:05:28.265 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.041) 0:05:28.307 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.054) 0:05:28.361 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.047) 0:05:28.408 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.117) 0:05:28.526 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.129) 0:05:28.656 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.066) 0:05:28.722 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.053) 0:05:28.776 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** 
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.043) 0:05:28.819 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.045) 0:05:28.865 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.039) 0:05:28.904 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.039) 0:05:28.944 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.049) 0:05:28.994 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.107) 0:05:29.101 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.085) 0:05:29.187 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.036) 0:05:29.224 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.033) 0:05:29.257 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 12 October 2024 20:55:34 -0400 (0:00:00.039) 0:05:29.296 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.055) 0:05:29.352 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.160) 0:05:29.513 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.212) 0:05:29.726 ****** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.066) 0:05:29.793 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.146) 0:05:29.940 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.056) 0:05:29.996 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.066) 0:05:30.063 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.054) 0:05:30.118 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.047) 0:05:30.166 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.036) 0:05:30.202 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.038) 0:05:30.241 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 12 October 2024 20:55:35 -0400 (0:00:00.058) 0:05:30.300 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.117) 0:05:30.417 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.157) 0:05:30.574 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.053) 0:05:30.628 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.056) 0:05:30.684 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.057) 0:05:30.742 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
deduplication is off] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.053) 0:05:30.796 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.044) 0:05:30.840 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.049) 0:05:30.890 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.054) 0:05:30.945 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.129) 0:05:31.074 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.041) 0:05:31.116 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.045) 0:05:31.161 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.042) 0:05:31.204 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.042) 0:05:31.247 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 12 October 2024 20:55:36 -0400 (0:00:00.034) 0:05:31.281 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.034) 0:05:31.315 ****** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.036) 0:05:31.352 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.069) 0:05:31.422 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.048) 0:05:31.470 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.240) 0:05:31.711 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be" }, "changed": false } TASK [Set some facts] 
********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.045) 0:05:31.757 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.046) 0:05:31.803 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.036) 0:05:31.839 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.051) 0:05:31.891 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.053) 0:05:31.944 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.052) 0:05:31.997 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.052) 0:05:32.049 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.065) 0:05:32.114 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.078) 0:05:32.193 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 12 October 2024 20:55:37 -0400 (0:00:00.084) 0:05:32.277 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.059) 0:05:32.337 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.096) 0:05:32.433 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.065) 0:05:32.498 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.050) 0:05:32.549 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.050) 0:05:32.600 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.060) 0:05:32.660 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.044) 0:05:32.705 ****** ok: [managed-node2] => { "changed": 
false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.061) 0:05:32.766 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.070) 0:05:32.837 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780924.5489616, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780924.5489616, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 364935, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1728780924.5489616, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.376) 0:05:33.214 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.045) 0:05:33.259 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 12 October 2024 20:55:38 -0400 (0:00:00.034) 0:05:33.294 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 12 October 2024 20:55:39 -0400 (0:00:00.050) 0:05:33.344 ****** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 12 October 2024 20:55:39 -0400 (0:00:00.059) 0:05:33.404 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 12 October 2024 20:55:39 -0400 (0:00:00.056) 
0:05:33.461 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 12 October 2024 20:55:39 -0400 (0:00:00.055) 0:05:33.516 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780924.660962, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780924.660962, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 364998, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1728780924.660962, "nlink": 1, "path": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 12 October 2024 20:55:39 -0400 (0:00:00.431) 0:05:33.947 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 12 October 2024 20:55:40 -0400 (0:00:00.641) 0:05:34.589 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.025498", "end": "2024-10-12 20:55:40.721744", "rc": 0, "start": "2024-10-12 20:55:40.696246" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: d7 80 0e 05 a6 b7 4f de 32 a1 f1 69 56 0c de 9e a8 bb 28 15 MK salt: 73 68 87 5a c4 31 b4 41 b4 ec 16 ba d3 96 21 4d 3d 70 73 10 7a 38 a8 27 d7 65 c3 00 45 9d 7b 07 MK iterations: 23988 UUID: 4780f917-80bf-4089-913a-4e99164f31be Key Slot 0: ENABLED Iterations: 383812 Salt: 73 84 e2 91 25 74 b3 68 81 e2 f5 1d df 20 7e 5e 95 86 c3 2c 6d 1a 01 d8 76 f1 60 67 69 88 fb d7 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 12 October 2024 20:55:40 -0400 (0:00:00.552) 0:05:35.142 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 12 October 2024 20:55:40 -0400 (0:00:00.083) 0:05:35.226 ****** ok: [managed-node2] => { 
"changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 12 October 2024 20:55:41 -0400 (0:00:00.076) 0:05:35.302 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 12 October 2024 20:55:41 -0400 (0:00:00.125) 0:05:35.427 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 12 October 2024 20:55:41 -0400 (0:00:00.139) 0:05:35.567 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 12 October 2024 20:55:41 -0400 (0:00:00.084) 0:05:35.651 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 12 October 2024 20:55:41 -0400 (0:00:00.094) 0:05:35.745 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 12 October 2024 20:55:41 -0400 (0:00:00.084) 0:05:35.830 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-4780f917-80bf-4089-913a-4e99164f31be /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 12 October 2024 20:55:41 -0400 (0:00:00.135) 0:05:35.966 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 12 October 2024 20:55:41 -0400 (0:00:00.090) 0:05:36.057 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 12 October 2024 20:55:41 -0400 (0:00:00.087) 0:05:36.144 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 12 October 2024 20:55:41 -0400 (0:00:00.095) 0:05:36.240 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.079) 0:05:36.320 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.047) 0:05:36.367 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.036) 0:05:36.404 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.045) 0:05:36.450 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.074) 0:05:36.525 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.062) 0:05:36.587 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.051) 0:05:36.638 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.051) 0:05:36.690 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.054) 0:05:36.744 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.072) 0:05:36.816 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.057) 0:05:36.873 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 12 October 2024 20:55:42 -0400 (0:00:00.053) 0:05:36.927 ****** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 12 October 2024 20:55:43 -0400 (0:00:00.650) 0:05:37.578 ****** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 12 October 2024 20:55:43 -0400 (0:00:00.407) 0:05:37.986 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 12 October 2024 20:55:43 -0400 (0:00:00.116) 0:05:38.102 ****** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 12 October 2024 20:55:43 -0400 (0:00:00.062) 0:05:38.165 ****** ok: [managed-node2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 12 October 2024 20:55:44 -0400 (0:00:00.570) 0:05:38.735 ****** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 12 October 2024 20:55:44 -0400 (0:00:00.104) 0:05:38.840 ****** skipping: [managed-node2] => {} TASK 
[Show test pool size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 12 October 2024 20:55:44 -0400 (0:00:00.140) 0:05:38.980 ****** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 12 October 2024 20:55:44 -0400 (0:00:00.087) 0:05:39.068 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 12 October 2024 20:55:44 -0400 (0:00:00.077) 0:05:39.146 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 12 October 2024 20:55:44 -0400 (0:00:00.054) 0:05:39.200 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 12 October 2024 20:55:44 -0400 (0:00:00.052) 0:05:39.252 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.051) 0:05:39.304 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.064) 0:05:39.368 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.057) 0:05:39.426 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.052) 0:05:39.478 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.076) 
0:05:39.555 ****** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.064) 0:05:39.619 ****** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.086) 0:05:39.705 ****** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.107) 0:05:39.813 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.065) 0:05:39.878 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.062) 0:05:39.941 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.094) 0:05:40.035 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.056) 0:05:40.091 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.071) 0:05:40.163 ****** ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 12 October 2024 20:55:45 -0400 (0:00:00.084) 0:05:40.247 ****** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 12 
October 2024 20:55:46 -0400 (0:00:00.135) 0:05:40.383 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 12 October 2024 20:55:46 -0400 (0:00:00.149) 0:05:40.532 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.019348", "end": "2024-10-12 20:55:46.569633", "rc": 0, "start": "2024-10-12 20:55:46.550285" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 12 October 2024 20:55:46 -0400 (0:00:00.424) 0:05:40.957 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 12 October 2024 20:55:46 -0400 (0:00:00.067) 0:05:41.024 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 12 October 2024 20:55:46 -0400 (0:00:00.086) 0:05:41.111 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 12 October 2024 20:55:46 -0400 (0:00:00.058) 0:05:41.170 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 12 October 2024 20:55:46 -0400 (0:00:00.058) 0:05:41.228 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 12 October 2024 20:55:46 -0400 (0:00:00.060) 0:05:41.288 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.056) 0:05:41.345 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }
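(The "Get information about the LV" call above shows the usual pattern for machine-readable LVM queries: lvs with --nameprefixes emits shell-style KEY=value pairs that can be picked apart with a regular expression. A minimal standalone sketch of that pattern — task and variable names here are illustrative, not the test's own:

  - name: Query the segment type of an LV
    command: lvs --noheadings --nameprefixes --units=b --nosuffix --unquoted -o name,segtype foo/test1
    register: lv_info
    changed_when: false   # read-only query, never reports a change

  - name: Extract the LVM2_SEGTYPE=... value from the key=value output
    set_fact:
      lv_segtype: "{{ lv_info.stdout | regex_findall('LVM2_SEGTYPE=(\\S+)') }}"

With the STDOUT shown above, lv_segtype would come out as ['linear'], matching the storage_test_lv_segtype fact the test records.)

TASK [Verify the volumes with no pool were correctly managed] ****************** task path: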
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.049) 0:05:41.394 ****** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.050) 0:05:41.445 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Verify preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:377 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.050) 0:05:41.496 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.109) 0:05:41.606 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.087) 0:05:41.693 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.065) 0:05:41.759 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.126) 0:05:41.885 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
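(The CentOS_7.yml vars file loaded above shows how the role keeps its package list platform-aware: the most specific matching vars file wins (RedHat.yml and CentOS.yml were skipped, CentOS_7.9.yml does not exist), and an inline Jinja conditional swaps in the s390x build of libblockdev where needed. As a sketch, the relevant part of such a vars file looks like:

  # roles/storage/vars/CentOS_7.yml (excerpt, as reported in this run)
  blivet_package_list:
    - python-enum34
    - python-blivet3
    - "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}"

On this x86_64 node the conditional resolves to plain libblockdev, which is why that package shows up in the install results later in the run.)

TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: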
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.053) 0:05:41.938 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.056) 0:05:41.995 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.056) 0:05:42.051 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.053) 0:05:42.104 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:55:47 -0400 (0:00:00.157) 0:05:42.262 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:55:49 -0400 (0:00:01.499) 0:05:43.762 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:55:49 -0400 (0:00:00.042) 0:05:43.804 ****** ok: [managed-node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } 
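(At this point the role has echoed its input back. The storage_pools structure printed above corresponds to an invocation of roughly this shape — a sketch reconstructed from the displayed variable, not a copy of the test file:

  - hosts: managed-node2
    tasks:
      - name: Run the storage role with the pool spec shown above
        include_role:
          name: fedora.linux_system_roles.storage
        vars:
          storage_pools:
            - name: foo
              type: lvm
              disks:
                - sda
              volumes:
                - name: test1
                  size: 4g
                  mount_point: /opt/test1

Note that no encryption_* settings are passed on this invocation; the point of this part of the test, per the "Verify preservation of encryption settings on existing LVM volume" play above, is that the existing LUKS configuration on foo/test1 must survive a run that does not mention it.)

TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: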
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:55:49 -0400 (0:00:00.037) 0:05:43.842 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:55:53 -0400 (0:00:03.932) 0:05:47.775 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:55:53 -0400 (0:00:00.111) 0:05:47.886 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:55:53 -0400 (0:00:00.048) 0:05:47.935 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:55:53 -0400 (0:00:00.054) 0:05:47.989 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:55:53 -0400 (0:00:00.047) 0:05:48.036 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:55:54 -0400 (0:00:00.725) 0:05:48.761 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, 
"blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { 
"name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": 
"mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", 
"state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service": { "name": "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { 
"name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:55:55 -0400 (0:00:01.106) 0:05:49.868 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:55:55 -0400 (0:00:00.053) 0:05:49.921 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2deb17a9a2\x2d7591\x2d4eb0\x2d9785\x2d67e5637fbe1b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service", "name": "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-replay.service systemd-readahead-collect.service dev-sda1.device cryptsetup-pre.target systemd-journald.socket system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-sda1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b /dev/sda1 VALUE_SPECIFIED_IN_NO_LOG_PARAMETER ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-eb17a9a2-7591-4eb0-9785-67e5637fbe1b ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", 
"StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-sda1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:55:56 -0400 (0:00:00.588) 0:05:50.509 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 12 October 2024 20:56:00 -0400 (0:00:04.117) 0:05:54.627 ****** skipping: 
[managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:56:00 -0400 (0:00:00.069) 0:05:54.697 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780927.796977, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "808db3331cf2470840def9e89b6b0f18940090c3", "ctime": 1728780927.793977, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1728780927.793977, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:56:00 -0400 (0:00:00.440) 0:05:55.137 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:56:00 -0400 (0:00:00.099) 0:05:55.237 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2deb17a9a2\x2d7591\x2d4eb0\x2d9785\x2d67e5637fbe1b.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service", "name": "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": 
"no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2deb17a9a2\\x2d7591\\x2d4eb0\\x2d9785\\x2d67e5637fbe1b.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:56:01 -0400 (0:00:00.724) 0:05:55.961 ****** ok: [managed-node2] => { "blivet_output": { "actions": [], "changed": false, "crypts": [], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be" ], "mounts": [ { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": 
null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:56:01 -0400 (0:00:00.100) 0:05:56.062 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false }
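(Two things are worth noting in the results above. First, blivet reported "actions": [] with "changed": false: the discovered layout already matches the requested state, so nothing was destroyed or recreated — exactly the preservation behavior this part of the test asserts. Second, the volume's private fields spell out the LUKS-on-LVM device stack for this run:

  raw LV        _raw_device = /dev/mapper/foo-test1 (the crypto_LUKS container on VG foo, backed by sda)
  opened LUKS   _device     = /dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be (carries the xfs filesystem)
  fstab source  _mount_id   = the same /dev/mapper/luks-... path, mounted at /opt/test1

so the fstab entry managed below references the opened mapping, not the raw LV.)

TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: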
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:56:01 -0400 (0:00:00.091) 0:05:56.154 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:56:01 -0400 (0:00:00.061) 0:05:56.215 ****** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 October 2024 20:56:01 -0400 (0:00:00.054) 0:05:56.270 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 20:56:02 -0400 (0:00:00.565) 0:05:56.836 ****** ok: [managed-node2] => (item={u'src': u'/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 20:56:02 -0400 (0:00:00.424) 0:05:57.260 ****** skipping: [managed-node2] => (item={u'src': u'/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 20:56:03 -0400 (0:00:00.083) 0:05:57.344 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 20:56:03 -0400 (0:00:00.526) 0:05:57.871 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780931.6159952, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, 
"blocks": 8, "charset": "us-ascii", "checksum": "9401888b8c170cf9ab1edb257581ae05abda223a", "ctime": 1728780929.7469864, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917511, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1728780929.7469864, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1741554872", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 20:56:03 -0400 (0:00:00.390) 0:05:58.262 ****** TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 20:56:04 -0400 (0:00:00.107) 0:05:58.369 ****** ok: [managed-node2] TASK [Assert preservation of encryption settings on existing LVM volume] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:391 Saturday 12 October 2024 20:56:04 -0400 (0:00:00.784) 0:05:59.154 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify role results] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:398 Saturday 12 October 2024 20:56:04 -0400 (0:00:00.070) 0:05:59.225 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 12 October 2024 20:56:05 -0400 (0:00:00.103) 0:05:59.328 ****** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, 
"mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 12 October 2024 20:56:05 -0400 (0:00:00.072) 0:05:59.401 ****** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 12 October 2024 20:56:05 -0400 (0:00:00.101) 0:05:59.503 ****** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "4780f917-80bf-4089-913a-4e99164f31be" }, "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "size": "4G", "type": "crypt", "uuid": "7eb1e6a9-0886-471f-968a-605a5b64e85c" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "nXc7RT-HEZY-OYjB-gzgg-aPxl-xWkJ-gSSNJT" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 12 October 2024 20:56:05 -0400 (0:00:00.673) 0:06:00.176 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002811", "end": "2024-10-12 20:56:06.199305", "rc": 0, "start": "2024-10-12 20:56:06.196494" } STDOUT: # system_role:storage # 
# /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 12 October 2024 20:56:06 -0400 (0:00:00.419) 0:06:00.595 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002475", "end": "2024-10-12 20:56:06.713505", "failed_when_result": false, "rc": 0, "start": "2024-10-12 20:56:06.711030" } STDOUT: luks-4780f917-80bf-4089-913a-4e99164f31be /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 12 October 2024 20:56:06 -0400 (0:00:00.502) 0:06:01.098 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 12 October 2024 20:56:06 -0400 (0:00:00.166) 0:06:01.264 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 12 October 2024 20:56:07 -0400 (0:00:00.054) 0:06:01.318 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.017472", "end": "2024-10-12 20:56:07.354636", "rc": 0, "start": "2024-10-12 20:56:07.337164" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 12 October 2024 20:56:07 -0400 (0:00:00.454) 0:06:01.772 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed
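The shared-VG verification above reduces to one LVM query plus an assertion. A minimal standalone sketch of the same check, assuming only that the lvm2 tools are installed on the target; the command line is exactly the one recorded in the log:

- name: Query the shared flag of VG foo (illustrative sketch)
  command: vgs --noheadings --binary -o shared foo
  register: vg_shared
  changed_when: false

- name: Assert that the VG is not shared
  assert:
    that:
      - vg_shared.stdout | trim == '0'

The --binary switch makes vgs report the flag as 0 or 1 rather than as an empty or textual value, so the assertion stays a plain string comparison, matching the STDOUT: 0 seen above.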
TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 12 October 2024 20:56:07 -0400 (0:00:00.068) 0:06:01.841 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 12 October 2024 20:56:07 -0400 (0:00:00.222) 0:06:02.063 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 12 October 2024 20:56:07 -0400 (0:00:00.094) 0:06:02.157 ****** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 12 October 2024 20:56:08 -0400 (0:00:00.436) 0:06:02.594 ****** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 12 October 2024 20:56:08 -0400 (0:00:00.129) 0:06:02.724 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 12 October 2024 20:56:08 -0400 (0:00:00.140) 0:06:02.864 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 12 October 2024 20:56:08 -0400 (0:00:00.194) 0:06:03.059 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 12 October 2024 20:56:08 -0400 (0:00:00.121) 0:06:03.180 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 12 October 2024 20:56:09 -0400 (0:00:00.125) 0:06:03.306 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type
of each PV] *********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 12 October 2024 20:56:09 -0400 (0:00:00.054) 0:06:03.361 ****** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 12 October 2024 20:56:09 -0400 (0:00:00.097) 0:06:03.459 ****** ok: [managed-node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.45.43 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 12 October 2024 20:56:09 -0400 (0:00:00.327) 0:06:03.786 ****** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 12 October 2024 20:56:09 -0400 (0:00:00.063) 0:06:03.850 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 12 October 2024 20:56:09 -0400 (0:00:00.115) 0:06:03.966 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 12 October 2024 20:56:09 -0400 (0:00:00.054) 0:06:04.021 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 12 October 2024 20:56:09 -0400 (0:00:00.051) 0:06:04.072 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 12 October 2024 20:56:09 -0400 (0:00:00.062) 0:06:04.135 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 12 October 2024 20:56:09 -0400 (0:00:00.057) 0:06:04.192 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task 
path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 12 October 2024 20:56:09 -0400 (0:00:00.055) 0:06:04.248 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 12 October 2024 20:56:10 -0400 (0:00:00.062) 0:06:04.310 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 12 October 2024 20:56:10 -0400 (0:00:00.063) 0:06:04.374 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 12 October 2024 20:56:10 -0400 (0:00:00.055) 0:06:04.430 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 12 October 2024 20:56:10 -0400 (0:00:00.071) 0:06:04.501 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 12 October 2024 20:56:10 -0400 (0:00:00.054) 0:06:04.555 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 12 October 2024 20:56:10 -0400 (0:00:00.099) 0:06:04.655 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 12 October 2024 20:56:10 -0400 (0:00:00.225) 0:06:04.880 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 12 October 2024 20:56:10 -0400 (0:00:00.208) 0:06:05.089 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 12 October 2024 20:56:10 -0400 (0:00:00.069) 0:06:05.158 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 12 October 2024 20:56:10 -0400 (0:00:00.065) 0:06:05.223 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 12 October 2024 20:56:10 -0400 (0:00:00.052) 0:06:05.276 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 12 October 2024 20:56:11 -0400 (0:00:00.052) 0:06:05.329 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 12 October 2024 20:56:11 -0400 (0:00:00.051) 0:06:05.381 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 12 October 2024 20:56:11 -0400 (0:00:00.102) 0:06:05.483 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 12 October 2024 20:56:11 -0400 (0:00:00.109) 0:06:05.593 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 12 October 2024 20:56:11 -0400 (0:00:00.218) 0:06:05.811 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 12 October 2024 20:56:11 -0400 (0:00:00.141) 0:06:05.953 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 12 October 2024 20:56:11 -0400 (0:00:00.056) 0:06:06.010 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 12 October 2024 20:56:11 -0400 (0:00:00.052) 0:06:06.063 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 12 October 2024 20:56:11 -0400 (0:00:00.064) 0:06:06.127 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 12 October 2024 20:56:11 -0400 (0:00:00.064) 0:06:06.191 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.133) 0:06:06.325 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.063) 0:06:06.388 ****** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.059) 0:06:06.448 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.106) 0:06:06.554 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.083) 0:06:06.637 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions 
passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.092) 0:06:06.730 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.088) 0:06:06.818 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.057) 0:06:06.876 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.082) 0:06:06.958 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.094) 0:06:07.053 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.053) 0:06:07.107 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 12 October 2024 20:56:12 -0400 (0:00:00.147) 0:06:07.254 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 12 October 2024 20:56:13 -0400 (0:00:00.119) 0:06:07.374 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 12 October 2024 20:56:13 -0400 (0:00:00.132) 0:06:07.506 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
deduplication is on] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 12 October 2024 20:56:13 -0400 (0:00:00.056) 0:06:07.563 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 12 October 2024 20:56:13 -0400 (0:00:00.055) 0:06:07.618 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 12 October 2024 20:56:13 -0400 (0:00:00.070) 0:06:07.689 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 12 October 2024 20:56:13 -0400 (0:00:00.053) 0:06:07.742 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 12 October 2024 20:56:13 -0400 (0:00:00.087) 0:06:07.830 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 12 October 2024 20:56:13 -0400 (0:00:00.100) 0:06:07.931 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 12 October 2024 20:56:13 -0400 (0:00:00.261) 0:06:08.192 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 12 October 2024 20:56:13 -0400 (0:00:00.074) 0:06:08.267 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 12 October 2024 20:56:14 -0400 (0:00:00.107) 0:06:08.374 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 12 October 2024 20:56:14 -0400 (0:00:00.055) 0:06:08.430 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 12 October 2024 20:56:14 -0400 (0:00:00.077) 0:06:08.508 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 12 October 2024 20:56:14 -0400 (0:00:00.063) 0:06:08.572 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 12 October 2024 20:56:14 -0400 (0:00:00.053) 0:06:08.626 ****** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 12 October 2024 20:56:14 -0400 (0:00:00.057) 0:06:08.683 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 12 October 2024 20:56:14 -0400 (0:00:00.170) 0:06:08.853 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 12 October 2024 20:56:14 -0400 (0:00:00.074) 0:06:08.928 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 12 October 2024 20:56:14 -0400 (0:00:00.317) 0:06:09.245 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 12 October 2024 20:56:15 -0400 (0:00:00.096) 0:06:09.342 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 12 October 2024 20:56:15 -0400 (0:00:00.126) 0:06:09.468 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 12 October 2024 20:56:15 -0400 (0:00:00.117) 0:06:09.586 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 12 October 2024 20:56:15 -0400 (0:00:00.163) 0:06:09.750 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 12 October 2024 20:56:15 -0400 (0:00:00.092) 0:06:09.842 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 12 October 2024 20:56:15 -0400 (0:00:00.114) 0:06:09.957 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 12 October 2024 20:56:15 -0400 (0:00:00.098) 0:06:10.055 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 12 October 2024 20:56:15 -0400 (0:00:00.057) 0:06:10.112 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 12 October 2024 20:56:15 -0400 (0:00:00.076) 0:06:10.189 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 12 October 2024 20:56:16 -0400 (0:00:00.135) 0:06:10.324 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 12 October 2024 20:56:16 -0400 (0:00:00.110) 0:06:10.435 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 12 October 2024 20:56:16 -0400 (0:00:00.092) 0:06:10.528 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 12 October 2024 20:56:16 -0400 (0:00:00.081) 0:06:10.609 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 12 October 2024 20:56:16 -0400 (0:00:00.077) 0:06:10.687 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 12 October 2024 20:56:16 -0400 (0:00:00.054) 0:06:10.741 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 12 October 2024 20:56:16 -0400 (0:00:00.058) 0:06:10.800 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 12 October 2024 20:56:16 -0400 (0:00:00.053) 0:06:10.854 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 12 October 2024 20:56:16 -0400 (0:00:00.071) 0:06:10.925 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 12 October 2024 20:56:16 -0400 (0:00:00.071) 0:06:10.997 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780940.7120388, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780924.5489616, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 364935, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1728780924.5489616, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 12 October 2024 20:56:17 -0400 (0:00:00.373) 0:06:11.370 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 12 October 2024 20:56:17 -0400 (0:00:00.073) 0:06:11.443 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 12 October 2024 20:56:17 -0400 (0:00:00.075) 0:06:11.518 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 12 October 2024 20:56:17 -0400 (0:00:00.068) 0:06:11.586 ****** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 12 October 2024 20:56:17 -0400 (0:00:00.074) 0:06:11.661 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 12 October 2024 20:56:17 -0400 (0:00:00.083) 0:06:11.745 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 12 October 2024 20:56:17 -0400 (0:00:00.075) 0:06:11.821 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780924.660962, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728780924.660962, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 364998, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1728780924.660962, "nlink": 1, "path": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 12 October 2024 20:56:18 -0400 (0:00:00.531) 0:06:12.352 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup
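The header dump collected next ties directly back to the volume facts printed earlier: Version: 1 corresponds to encryption_luks_version: luks1, and the aes cipher in xts-plain64 mode with a 512-bit master key (MK bits: 512) appears to be what blivet selects when encryption_key_size is left at 0. A minimal standalone probe of the same header, assuming cryptsetup is present as the task above just ensured:

- name: Dump the LUKS header of the backing LV (illustrative sketch)
  command: cryptsetup luksDump /dev/mapper/foo-test1
  register: luks_dump
  changed_when: false

- name: Assert the volume kept LUKS version 1 (illustrative sketch)
  assert:
    that:
      - luks_dump.stdout is search('Version:\s+1')

Note in the dump that follows that only key slot 0 is ENABLED, i.e. a single passphrase; the remaining seven LUKS1 key slots stay free for luksAddKey.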
TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 12 October 2024 20:56:18 -0400 (0:00:00.789) 0:06:13.142 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.025461", "end": "2024-10-12 20:56:19.155159", "rc": 0, "start": "2024-10-12 20:56:19.129698" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: d7 80 0e 05 a6 b7 4f de 32 a1 f1 69 56 0c de 9e a8 bb 28 15 MK salt: 73 68 87 5a c4 31 b4 41 b4 ec 16 ba d3 96 21 4d 3d 70 73 10 7a 38 a8 27 d7 65 c3 00 45 9d 7b 07 MK iterations: 23988 UUID: 4780f917-80bf-4089-913a-4e99164f31be Key Slot 0: ENABLED Iterations: 383812 Salt: 73 84 e2 91 25 74 b3 68 81 e2 f5 1d df 20 7e 5e 95 86 c3 2c 6d 1a 01 d8 76 f1 60 67 69 88 fb d7 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.466) 0:06:13.609 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.084) 0:06:13.694 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.082) 0:06:13.776 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.075) 0:06:13.852 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.063) 0:06:13.915 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.055) 0:06:13.971 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.037) 0:06:14.008 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.037) 0:06:14.046 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-4780f917-80bf-4089-913a-4e99164f31be /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 12
October 2024 20:56:19 -0400 (0:00:00.044) 0:06:14.091 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.041) 0:06:14.132 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.043) 0:06:14.175 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.047) 0:06:14.223 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 12 October 2024 20:56:19 -0400 (0:00:00.057) 0:06:14.280 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 12 October 2024 20:56:20 -0400 (0:00:00.057) 0:06:14.337 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 12 October 2024 20:56:20 -0400 (0:00:00.060) 0:06:14.397 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 12 October 2024 20:56:20 -0400 (0:00:00.055) 0:06:14.453 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 12 October 2024 20:56:20 -0400 (0:00:00.053) 0:06:14.507 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 12 October 2024 20:56:20 -0400 (0:00:00.077) 0:06:14.585 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 12 October 2024 20:56:20 -0400 (0:00:00.087) 0:06:14.673 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 12 October 2024 20:56:20 -0400 (0:00:00.097) 0:06:14.770 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 12 October 2024 20:56:20 -0400 (0:00:00.110) 0:06:14.880 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 12 October 2024 20:56:20 -0400 (0:00:00.063) 0:06:14.943 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 12 October 2024 20:56:20 -0400 (0:00:00.061) 0:06:15.005 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 12 October 2024 20:56:20 -0400 (0:00:00.071) 0:06:15.077 ****** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 12 October 2024 20:56:21 -0400 (0:00:00.421) 0:06:15.498 ****** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 12 October 2024 20:56:21 -0400 (0:00:00.468) 0:06:15.966 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 12 October 2024 20:56:21 -0400 (0:00:00.095) 0:06:16.062 ****** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }
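
Note that the parsed actual size and the parsed requested size resolve to the same figure: 4294967296 bytes = 4 x 1024^3, i.e. the "4g" the pool definition asked for, rendered as "4g" in LVM notation and "4GiB" in parted notation. The comparison performed a few tasks later therefore reduces to an integer equality, roughly like this (a sketch, not the test's own wording; the fact names match those visible in the log):

    - name: Assert expected size is actual size (sketch)
      ansible.builtin.assert:
        that:
          # storage_test_expected_size is stored as a string, so cast before comparing
          - storage_test_actual_size.bytes == storage_test_expected_size | int
        fail_msg: >-
          expected {{ storage_test_expected_size }} bytes,
          got {{ storage_test_actual_size.bytes }} bytes

TASK [Get the size of parent/pool device] ************************************** task path: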
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 12 October 2024 20:56:21 -0400 (0:00:00.095) 0:06:16.157 ****** ok: [managed-node2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.420) 0:06:16.578 ****** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.080) 0:06:16.658 ****** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.057) 0:06:16.715 ****** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.052) 0:06:16.768 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.060) 0:06:16.828 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.052) 0:06:16.881 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.056) 0:06:16.937 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.078) 0:06:17.016 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.077) 0:06:17.093 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** 
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.079) 0:06:17.172 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.051) 0:06:17.224 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 12 October 2024 20:56:22 -0400 (0:00:00.050) 0:06:17.274 ****** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 12 October 2024 20:56:23 -0400 (0:00:00.053) 0:06:17.328 ****** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 12 October 2024 20:56:23 -0400 (0:00:00.046) 0:06:17.374 ****** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 12 October 2024 20:56:23 -0400 (0:00:00.036) 0:06:17.411 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 12 October 2024 20:56:23 -0400 (0:00:00.036) 0:06:17.448 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 12 October 2024 20:56:23 -0400 (0:00:00.043) 0:06:17.492 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 12 October 2024 20:56:23 -0400 (0:00:00.052) 0:06:17.545 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 12 October 2024 20:56:23 -0400 (0:00:00.053) 0:06:17.598 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** 
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 12 October 2024 20:56:23 -0400 (0:00:00.057) 0:06:17.656 ****** ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 12 October 2024 20:56:23 -0400 (0:00:00.069) 0:06:17.725 ****** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 12 October 2024 20:56:23 -0400 (0:00:00.059) 0:06:17.785 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 12 October 2024 20:56:23 -0400 (0:00:00.080) 0:06:17.866 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.017378", "end": "2024-10-12 20:56:24.080695", "rc": 0, "start": "2024-10-12 20:56:24.063317" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 12 October 2024 20:56:24 -0400 (0:00:00.602) 0:06:18.469 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 12 October 2024 20:56:24 -0400 (0:00:00.083) 0:06:18.552 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 12 October 2024 20:56:24 -0400 (0:00:00.087) 0:06:18.639 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 12 October 2024 20:56:24 -0400 (0:00:00.056) 0:06:18.697 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
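
The cache probe above shells out to lvs: --nameprefixes --unquoted makes every column come back as an LVM2_KEY=value pair that is trivial to parse, and --units=b --nosuffix keeps sizes as plain byte counts. LVM2_SEGTYPE=linear confirms the LV is plain rather than cached, which is why the cache-size tasks here are skipped. Reconstructed as a task (a sketch; only the command itself is taken verbatim from the cmd array logged above):

    - name: Get information about the LV (sketch)
      ansible.builtin.command:
        argv:
          - lvs
          - --noheadings
          - --nameprefixes
          - --units=b
          - --nosuffix
          - --unquoted
          - -o
          - name,attr,cache_total_blocks,chunk_size,segtype
          - foo/test1
      register: lv_info
      changed_when: false  # read-only query, never reports a change

TASK [Set expected cache size] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 12 October 2024 20:56:24 -0400 (0:00:00.056) 0:06:18.753 ****** skipping: [managed-node2]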
=> { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 12 October 2024 20:56:24 -0400 (0:00:00.056) 0:06:18.810 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 12 October 2024 20:56:24 -0400 (0:00:00.060) 0:06:18.871 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 12 October 2024 20:56:24 -0400 (0:00:00.054) 0:06:18.925 ****** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 12 October 2024 20:56:24 -0400 (0:00:00.054) 0:06:18.980 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 12 October 2024 20:56:24 -0400 (0:00:00.054) 0:06:19.034 ****** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:404 Saturday 12 October 2024 20:56:25 -0400 (0:00:00.402) 0:06:19.437 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 12 October 2024 20:56:25 -0400 (0:00:00.115) 0:06:19.552 ****** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 12 October 2024 20:56:25 -0400 (0:00:00.069) 0:06:19.622 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:56:25 -0400 (0:00:00.101) 0:06:19.723 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : 
Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:56:25 -0400 (0:00:00.209) 0:06:19.933 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:56:25 -0400 (0:00:00.065) 0:06:19.998 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:56:25 -0400 (0:00:00.185) 0:06:20.183 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:56:25 -0400 (0:00:00.102) 0:06:20.286 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:56:26 -0400 (0:00:00.120) 0:06:20.406 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:56:26 -0400 (0:00:00.109) 0:06:20.515 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false }
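
The platform/version variable files are tried from least to most specific (RedHat.yml, CentOS.yml, CentOS_7.yml, CentOS_7.9.yml), and only the ones that actually exist in the role's vars/ directory get loaded, which is why CentOS_7.yml is the single "ok" item here. Note the inline Jinja2 conditional inside blivet_package_list that swaps in libblockdev-s390 on s390x hosts. One common way to write such a loader (a sketch under assumed conventions, not necessarily the role's exact task):

    - name: Set platform/version specific variables (sketch)
      ansible.builtin.include_vars: "{{ __vars_file }}"
      vars:
        __vars_file: "{{ role_path }}/vars/{{ item }}"
      when: __vars_file is file  # items without a matching file are skipped
      loop:
        - "{{ ansible_os_family }}.yml"
        - "{{ ansible_distribution }}.yml"
        - "{{ ansible_distribution }}_{{ ansible_distribution_major_version }}.yml"
        - "{{ ansible_distribution }}_{{ ansible_distribution_version }}.yml"

TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:56:26 -0400 (0:00:00.065) 0:06:20.581 ****** included: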
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:56:26 -0400 (0:00:00.182) 0:06:20.763 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:56:28 -0400 (0:00:01.651) 0:06:22.414 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:56:28 -0400 (0:00:00.075) 0:06:22.490 ****** ok: [managed-node2] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:56:28 -0400 (0:00:00.073) 0:06:22.563 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:56:32 -0400 (0:00:03.926) 0:06:26.489 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:56:32 -0400 (0:00:00.105) 0:06:26.595 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:56:32 -0400 (0:00:00.051) 0:06:26.646 ****** skipping: [managed-node2] => { "changed": 
false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:56:32 -0400 (0:00:00.054) 0:06:26.700 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:56:32 -0400 (0:00:00.052) 0:06:26.752 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:56:33 -0400 (0:00:00.715) 0:06:27.467 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": 
"getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": 
"stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { 
"name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { 
"name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service": { "name": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, 
"systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": 
"targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:56:34 -0400 (0:00:00.932) 0:06:28.400 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:56:34 -0400 (0:00:00.052) 0:06:28.452 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d4780f917\x2d80bf\x2d4089\x2d913a\x2d4e99164f31be.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "name": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service system-systemd\\x2dcryptsetup.slice cryptsetup-pre.target systemd-journald.socket dev-mapper-foo\\x2dtest1.device systemd-readahead-replay.service", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-4780f917-80bf-4089-913a-4e99164f31be", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4780f917-80bf-4089-913a-4e99164f31be /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4780f917-80bf-4089-913a-4e99164f31be ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": 
"0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "dev-mapper-luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:56:34 -0400 (0:00:00.460) 0:06:28.913 ****** fatal: [managed-node2]: FAILED! 
=> { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'luks-4780f917-80bf-4089-913a-4e99164f31be' in safe mode due to encryption removal TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 12 October 2024 20:56:38 -0400 (0:00:04.183) 0:06:33.096 ****** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': False, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': 0, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': u'luks1', u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': 
False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'luks-4780f917-80bf-4089-913a-4e99164f31be' in safe mode due to encryption removal"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:56:38 -0400 (0:00:00.066) 0:06:33.162 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d4780f917\x2d80bf\x2d4089\x2d913a\x2d4e99164f31be.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "name": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", 
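The fatal result above is the role's safe mode doing its job: the blivet module was invoked with safe_mode: true (visible in the module_args dump), so it refuses to destroy the existing LUKS formatting rather than silently wipe an encrypted device. A minimal sketch of the kind of play that authorizes the removal, assuming the role's documented storage_safe_mode variable and the pool layout shown later in this log:

  - hosts: managed-node2
    tasks:
      - name: Remove the encryption layer (destructive, so safe mode must be off)
        ansible.builtin.include_role:
          name: fedora.linux_system_roles.storage
        vars:
          storage_safe_mode: false      # default is true, which caused the failure above
          storage_pools:
            - name: foo
              disks: [sda]
              volumes:
                - name: test1
                  size: 4g
                  mount_point: /opt/test1
                  encryption: false     # flipping encrypted -> plain re-formats the LV
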
"PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 12 October 2024 20:56:39 -0400 (0:00:00.706) 0:06:33.868 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 12 October 2024 20:56:39 -0400 (0:00:00.065) 0:06:33.934 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 12 October 2024 20:56:39 -0400 (0:00:00.072) 0:06:34.006 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 12 October 2024 20:56:39 -0400 (0:00:00.065) 0:06:34.072 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780985.040253, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1728780985.040253, "dev": 64769, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1728780985.040253, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "676903701", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] 
**************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 12 October 2024 20:56:40 -0400 (0:00:00.370) 0:06:34.442 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Remove the encryption layer] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:427 Saturday 12 October 2024 20:56:40 -0400 (0:00:00.087) 0:06:34.530 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:56:40 -0400 (0:00:00.194) 0:06:34.725 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:56:40 -0400 (0:00:00.098) 0:06:34.823 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:56:40 -0400 (0:00:00.057) 0:06:34.881 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:56:40 -0400 (0:00:00.280) 0:06:35.161 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:56:40 -0400 (0:00:00.053) 0:06:35.215 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to 
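The vars pass above walks RedHat.yml, CentOS.yml, CentOS_7.yml, CentOS_7.9.yml from generic to specific, loading each file that exists (here only CentOS_7.yml) so that more specific values override generic ones. A simpler cousin of the same idea loads just the single most specific match via first_found; this is illustrative, not necessarily the role's exact task:

  - name: Load platform/version specific variables (illustrative first_found pattern)
    ansible.builtin.include_vars: "{{ lookup('ansible.builtin.first_found', params) }}"
    vars:
      params:
        files:
          - "{{ ansible_distribution }}_{{ ansible_distribution_version }}.yml"        # CentOS_7.9.yml
          - "{{ ansible_distribution }}_{{ ansible_distribution_major_version }}.yml"  # CentOS_7.yml
          - "{{ ansible_distribution }}.yml"                                           # CentOS.yml
          - "{{ ansible_os_family }}.yml"                                              # RedHat.yml
        paths:
          - "{{ role_path }}/vars"
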
be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:56:40 -0400 (0:00:00.065) 0:06:35.280 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:56:41 -0400 (0:00:00.050) 0:06:35.331 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:56:41 -0400 (0:00:00.050) 0:06:35.381 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:56:41 -0400 (0:00:00.104) 0:06:35.486 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:56:42 -0400 (0:00:01.395) 0:06:36.882 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": false, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:56:42 -0400 (0:00:00.072) 0:06:36.954 ****** ok: [managed-node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:56:42 -0400 (0:00:00.078) 0:06:37.033 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "lvm2" ], "pools": [], "volumes": [] } TASK 
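Before touching any storage, the role proves its tool chain is installed; the package list above comes straight from the CentOS_7.yml vars loaded earlier, including the Jinja pick between libblockdev and libblockdev-s390. As a sketch, assuming that blivet_package_list variable:

  - name: Make sure blivet is available
    ansible.builtin.package:
      name: "{{ blivet_package_list }}"
      state: present
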
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:56:46 -0400 (0:00:04.192) 0:06:41.225 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:56:47 -0400 (0:00:00.119) 0:06:41.345 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:56:47 -0400 (0:00:00.059) 0:06:41.405 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:56:47 -0400 (0:00:00.095) 0:06:41.501 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:56:47 -0400 (0:00:00.071) 0:06:41.572 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:56:48 -0400 (0:00:00.885) 0:06:42.457 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": 
"static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, 
"dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": 
"static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", 
"source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service": { "name": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": 
"systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:56:49 -0400 (0:00:01.321) 0:06:43.778 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:56:49 -0400 (0:00:00.104) 0:06:43.882 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d4780f917\x2d80bf\x2d4089\x2d913a\x2d4e99164f31be.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "name": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "system-systemd\\x2dcryptsetup.slice systemd-readahead-collect.service dev-mapper-foo\\x2dtest1.device systemd-readahead-replay.service cryptsetup-pre.target systemd-journald.socket", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "cryptsetup.target umount.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", 
"Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-4780f917-80bf-4089-913a-4e99164f31be", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4780f917-80bf-4089-913a-4e99164f31be /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4780f917-80bf-4089-913a-4e99164f31be ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "dev-mapper-luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.device cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", 
"SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:56:50 -0400 (0:00:00.656) 0:06:44.539 ****** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-4780f917-80bf-4089-913a-4e99164f31be", "password": "-", "state": "absent" } ], "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", 
"vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 12 October 2024 20:57:55 -0400 (0:01:05.024) 0:07:49.563 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:57:55 -0400 (0:00:00.071) 0:07:49.635 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780927.796977, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "808db3331cf2470840def9e89b6b0f18940090c3", "ctime": 1728780927.793977, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1728780927.793977, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:57:55 -0400 (0:00:00.470) 0:07:50.106 ****** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:57:56 -0400 (0:00:00.476) 0:07:50.582 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d4780f917\x2d80bf\x2d4089\x2d913a\x2d4e99164f31be.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "name": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", 
"ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "RequiredBy": "cryptsetup.target dev-mapper-luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.device", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:57:56 -0400 (0:00:00.685) 0:07:51.267 ****** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" } ], "changed": 
true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-4780f917-80bf-4089-913a-4e99164f31be", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/mapper/foo-test1", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" } ], "packages": [ "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:57:57 -0400 (0:00:00.119) 0:07:51.387 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], 
"encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:57:57 -0400 (0:00:00.070) 0:07:51.457 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:57:57 -0400 (0:00:00.070) 0:07:51.528 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-4780f917-80bf-4089-913a-4e99164f31be" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 October 2024 20:57:57 -0400 (0:00:00.403) 0:07:51.931 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 20:57:58 -0400 (0:00:00.511) 0:07:52.443 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 20:57:58 -0400 (0:00:00.470) 0:07:52.914 ****** 
skipping: [managed-node2] => (item={u'src': u'/dev/mapper/foo-test1', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 20:57:58 -0400 (0:00:00.072) 0:07:52.986 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 20:57:59 -0400 (0:00:00.743) 0:07:53.730 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728780931.6159952, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "9401888b8c170cf9ab1edb257581ae05abda223a", "ctime": 1728780929.7469864, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917511, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1728780929.7469864, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1741554872", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 20:58:00 -0400 (0:00:00.577) 0:07:54.308 ****** changed: [managed-node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-4780f917-80bf-4089-913a-4e99164f31be', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-4780f917-80bf-4089-913a-4e99164f31be", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 20:58:00 -0400 (0:00:00.336) 0:07:54.644 ****** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:443 Saturday 12 October 2024 20:58:01 -0400 (0:00:00.682) 0:07:55.327 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 12 October 2024 20:58:01 -0400 (0:00:00.079) 0:07:55.406 ****** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/foo-test1", "_kernel_device": "/dev/dm-0", "_mount_id": "/dev/mapper/foo-test1", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": 0, "encryption_luks_version": "luks1", "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 12 October 2024 20:58:01 -0400 (0:00:00.046) 0:07:55.453 ****** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] 
***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 12 October 2024 20:58:01 -0400 (0:00:00.033) 0:07:55.486 ****** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "a78bc64c-9e92-44bb-9bba-5d9cb28a715d" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "nXc7RT-HEZY-OYjB-gzgg-aPxl-xWkJ-gSSNJT" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 12 October 2024 20:58:01 -0400 (0:00:00.330) 0:07:55.816 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002926", "end": "2024-10-12 20:58:01.789075", "rc": 0, "start": "2024-10-12 20:58:01.786149" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs 
ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/foo-test1 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 12 October 2024 20:58:01 -0400 (0:00:00.333) 0:07:56.150 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002845", "end": "2024-10-12 20:58:02.097970", "failed_when_result": false, "rc": 0, "start": "2024-10-12 20:58:02.095125" } TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 12 October 2024 20:58:02 -0400 (0:00:00.324) 0:07:56.474 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 12 October 2024 20:58:02 -0400 (0:00:00.091) 0:07:56.565 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 12 October 2024 20:58:02 -0400 (0:00:00.036) 0:07:56.602 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.018766", "end": "2024-10-12 20:58:02.593728", "rc": 0, "start": "2024-10-12 20:58:02.574962" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 12 October 2024 20:58:02 -0400 (0:00:00.379) 0:07:56.981 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 12 October 2024 20:58:02 -0400 (0:00:00.071) 0:07:57.053 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 12 October 2024 20:58:02 -0400 (0:00:00.122) 0:07:57.176 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 12 October 2024 20:58:02 -0400 (0:00:00.060) 0:07:57.237 ****** ok: [managed-node2] => (item=/dev/sda) => { 
"ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 12 October 2024 20:58:03 -0400 (0:00:00.341) 0:07:57.578 ****** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 12 October 2024 20:58:03 -0400 (0:00:00.055) 0:07:57.633 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 12 October 2024 20:58:03 -0400 (0:00:00.058) 0:07:57.691 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 12 October 2024 20:58:03 -0400 (0:00:00.053) 0:07:57.745 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 12 October 2024 20:58:03 -0400 (0:00:00.043) 0:07:57.788 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 12 October 2024 20:58:03 -0400 (0:00:00.042) 0:07:57.831 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 Saturday 12 October 2024 20:58:03 -0400 (0:00:00.035) 0:07:57.866 ****** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 12 October 2024 20:58:03 -0400 (0:00:00.051) 0:07:57.917 ****** ok: [managed-node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.45.43 closed. 
TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 12 October 2024 20:58:03 -0400 (0:00:00.229) 0:07:58.147 ****** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 12 October 2024 20:58:03 -0400 (0:00:00.058) 0:07:58.205 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.131) 0:07:58.337 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.076) 0:07:58.414 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.085) 0:07:58.499 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.055) 0:07:58.555 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.062) 0:07:58.617 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.057) 0:07:58.674 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.064) 0:07:58.738 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.050) 0:07:58.789 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.111) 0:07:58.900 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.055) 0:07:58.955 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.054) 0:07:59.010 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.054) 0:07:59.064 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 12 October 2024 20:58:04 -0400 (0:00:00.117) 0:07:59.181 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.128) 0:07:59.310 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.084) 0:07:59.394 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.051) 0:07:59.445 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** 
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.077) 0:07:59.523 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.084) 0:07:59.607 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.050) 0:07:59.658 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.045) 0:07:59.703 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.035) 0:07:59.739 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.077) 0:07:59.817 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.102) 0:07:59.919 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.053) 0:07:59.972 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.077) 0:08:00.049 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.072) 0:08:00.122 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 12 October 2024 20:58:05 -0400 (0:00:00.080) 0:08:00.202 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.146) 0:08:00.348 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.061) 0:08:00.410 ****** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.067) 0:08:00.478 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.197) 0:08:00.675 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.061) 0:08:00.737 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.091) 0:08:00.829 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.052) 0:08:00.881 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result 
was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.062) 0:08:00.944 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.054) 0:08:00.999 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.060) 0:08:01.059 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.065) 0:08:01.124 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 12 October 2024 20:58:06 -0400 (0:00:00.139) 0:08:01.264 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.125) 0:08:01.390 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.070) 0:08:01.460 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.072) 0:08:01.532 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.057) 0:08:01.590 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO 
deduplication is off] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.053) 0:08:01.643 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.056) 0:08:01.699 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.058) 0:08:01.758 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.045) 0:08:01.803 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.105) 0:08:01.908 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.052) 0:08:01.961 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pools was created] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.047) 0:08:02.009 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.045) 0:08:02.055 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.040) 0:08:02.096 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.046) 0:08:02.142 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.042) 0:08:02.185 ****** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.034) 0:08:02.220 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 12 October 2024 20:58:07 -0400 (0:00:00.070) 0:08:02.290 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.098) 0:08:02.389 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.171) 0:08:02.560 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/foo-test1" }, "changed": false } TASK [Set some facts] ********************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.041) 0:08:02.601 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.055) 0:08:02.656 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.054) 0:08:02.711 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.087) 0:08:02.799 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.063) 0:08:02.863 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.051) 0:08:02.914 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.050) 0:08:02.965 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.043) 0:08:03.009 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.042) 0:08:03.051 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 12 October 2024 
20:58:08 -0400 (0:00:00.047) 0:08:03.099 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.043) 0:08:03.142 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/foo-test1 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.062) 0:08:03.205 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 12 October 2024 20:58:08 -0400 (0:00:00.044) 0:08:03.250 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.054) 0:08:03.304 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.051) 0:08:03.356 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.055) 0:08:03.411 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.047) 0:08:03.459 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.056) 0:08:03.515 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.050) 0:08:03.565 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728781075.1296837, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728781075.1296837, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 385962, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1728781075.1296837, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.327) 0:08:03.893 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.044) 0:08:03.937 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.041) 0:08:03.979 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.062) 0:08:04.041 ****** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.060) 0:08:04.102 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.056) 0:08:04.158 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] 
************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.063) 0:08:04.222 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 12 October 2024 20:58:09 -0400 (0:00:00.051) 0:08:04.273 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 12 October 2024 20:58:10 -0400 (0:00:00.690) 0:08:04.964 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 12 October 2024 20:58:10 -0400 (0:00:00.054) 0:08:05.019 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 12 October 2024 20:58:10 -0400 (0:00:00.052) 0:08:05.072 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 12 October 2024 20:58:10 -0400 (0:00:00.070) 0:08:05.142 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 12 October 2024 20:58:10 -0400 (0:00:00.054) 0:08:05.197 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 12 October 2024 20:58:10 -0400 (0:00:00.051) 0:08:05.249 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 12 October 2024 20:58:10 -0400 (0:00:00.050) 0:08:05.299 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.050) 0:08:05.350 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.051) 0:08:05.402 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.064) 0:08:05.467 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.054) 0:08:05.521 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.041) 0:08:05.562 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.045) 0:08:05.608 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.040) 0:08:05.649 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.034) 0:08:05.684 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.033) 0:08:05.717 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] 
************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.035) 0:08:05.753 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.040) 0:08:05.793 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.049) 0:08:05.843 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.051) 0:08:05.895 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.061) 0:08:05.956 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.062) 0:08:06.018 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.057) 0:08:06.076 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.132) 0:08:06.208 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 12 October 2024 20:58:11 -0400 (0:00:00.059) 0:08:06.268 ****** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 12 
October 2024 20:58:12 -0400 (0:00:00.445) 0:08:06.713 ****** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 12 October 2024 20:58:12 -0400 (0:00:00.362) 0:08:07.076 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 12 October 2024 20:58:12 -0400 (0:00:00.075) 0:08:07.152 ****** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 12 October 2024 20:58:12 -0400 (0:00:00.060) 0:08:07.212 ****** ok: [managed-node2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 12 October 2024 20:58:13 -0400 (0:00:00.551) 0:08:07.763 ****** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 12 October 2024 20:58:13 -0400 (0:00:00.067) 0:08:07.831 ****** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 12 October 2024 20:58:13 -0400 (0:00:00.068) 0:08:07.900 ****** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 12 October 2024 20:58:13 -0400 (0:00:00.069) 0:08:07.970 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 12 October 2024 20:58:13 -0400 (0:00:00.061) 0:08:08.031 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 12 October 2024 20:58:13 -0400 (0:00:00.053) 0:08:08.084 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 
Saturday 12 October 2024 20:58:13 -0400 (0:00:00.059) 0:08:08.144 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 12 October 2024 20:58:13 -0400 (0:00:00.062) 0:08:08.207 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 12 October 2024 20:58:13 -0400 (0:00:00.047) 0:08:08.254 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.045) 0:08:08.300 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.045) 0:08:08.345 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.049) 0:08:08.395 ****** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.056) 0:08:08.451 ****** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.053) 0:08:08.505 ****** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.051) 0:08:08.557 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.061) 0:08:08.619 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 
Saturday 12 October 2024 20:58:14 -0400 (0:00:00.080) 0:08:08.699 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.074) 0:08:08.774 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.054) 0:08:08.828 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.055) 0:08:08.884 ****** ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.072) 0:08:08.957 ****** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.057) 0:08:09.015 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 12 October 2024 20:58:14 -0400 (0:00:00.071) 0:08:09.086 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.017431", "end": "2024-10-12 20:58:15.109640", "rc": 0, "start": "2024-10-12 20:58:15.092209" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 12 October 2024 20:58:15 -0400 (0:00:00.413) 0:08:09.500 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 12 October 2024 20:58:15 -0400 (0:00:00.058) 0:08:09.559 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] 
******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 12 October 2024 20:58:15 -0400 (0:00:00.062) 0:08:09.621 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 12 October 2024 20:58:15 -0400 (0:00:00.054) 0:08:09.676 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 12 October 2024 20:58:15 -0400 (0:00:00.056) 0:08:09.733 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 12 October 2024 20:58:15 -0400 (0:00:00.068) 0:08:09.801 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clean up facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 12 October 2024 20:58:15 -0400 (0:00:00.056) 0:08:09.858 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 12 October 2024 20:58:15 -0400 (0:00:00.046) 0:08:09.905 ****** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 12 October 2024 20:58:15 -0400 (0:00:00.043) 0:08:09.948 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Create a file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/create-test-file.yml:12 Saturday 12 October 2024 20:58:15 -0400 (0:00:00.046) 0:08:09.995 ****** changed: [managed-node2] => { "changed": true, "dest": "/opt/test1/quux", "gid": 0, "group": "root", "mode": "0644", "owner": "root", "secontext": "unconfined_u:object_r:unlabeled_t:s0", "size": 0, "state": "file", "uid": 0 } TASK [Test for correct handling of safe_mode] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:449 Saturday 12 October 2024 20:58:16 -0400 (0:00:00.379) 0:08:10.375 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml for managed-node2 TASK [Store global variable value copy] **************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:4 Saturday 12 October 2024 20:58:16 -0400 (0:00:00.137) 0:08:10.512 ****** ok: [managed-node2] => { "ansible_facts": { "storage_pools_global": [], "storage_safe_mode_global": true, "storage_volumes_global": [] }, "changed": false } TASK [Verify role raises correct error] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:10 Saturday 12 October 2024 20:58:16 -0400 (0:00:00.062) 0:08:10.574 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:58:16 -0400 (0:00:00.087) 0:08:10.662 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:58:16 -0400 (0:00:00.111) 0:08:10.773 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:58:16 -0400 (0:00:00.069) 0:08:10.843 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:58:16 -0400 (0:00:00.148) 0:08:10.991 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:58:16 -0400 (0:00:00.062) 0:08:11.054 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an 
empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:58:16 -0400 (0:00:00.051) 0:08:11.105 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:58:16 -0400 (0:00:00.051) 0:08:11.156 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:58:16 -0400 (0:00:00.064) 0:08:11.221 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:58:17 -0400 (0:00:00.140) 0:08:11.362 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:58:18 -0400 (0:00:01.513) 0:08:12.875 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:58:18 -0400 (0:00:00.101) 0:08:12.976 ****** ok: [managed-node2] => { "storage_volumes": [] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:58:18 -0400 (0:00:00.068) 0:08:13.045 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK 
[fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:58:22 -0400 (0:00:04.000) 0:08:17.045 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:58:22 -0400 (0:00:00.097) 0:08:17.143 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:58:22 -0400 (0:00:00.048) 0:08:17.191 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:58:22 -0400 (0:00:00.053) 0:08:17.244 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:58:22 -0400 (0:00:00.049) 0:08:17.294 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:58:23 -0400 (0:00:00.753) 0:08:18.047 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": 
{ "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": 
"dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { 
"name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, 
"plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, 
"selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service": { "name": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": 
"systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:58:24 -0400 (0:00:01.100) 0:08:19.148 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [ "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service" ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:58:24 -0400 (0:00:00.088) 0:08:19.237 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d4780f917\x2d80bf\x2d4089\x2d913a\x2d4e99164f31be.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "name": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "After": "systemd-readahead-collect.service systemd-readahead-replay.service dev-mapper-foo\\x2dtest1.device systemd-journald.socket cryptsetup-pre.target system-systemd\\x2dcryptsetup.slice", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "Before": "umount.target cryptsetup.target", "BindsTo": "dev-mapper-foo\\x2dtest1.device", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", 
"CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "Conflicts": "umount.target", "ControlPID": "0", "DefaultDependencies": "no", "Delegate": "no", "Description": "Cryptography Setup for luks-4780f917-80bf-4089-913a-4e99164f31be", "DevicePolicy": "auto", "Documentation": "man:crypttab(5) man:systemd-cryptsetup-generator(8) man:systemd-cryptsetup@.service(8)", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup attach luks-4780f917-80bf-4089-913a-4e99164f31be /dev/mapper/foo-test1 - ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStop": "{ path=/usr/lib/systemd/systemd-cryptsetup ; argv[]=/usr/lib/systemd/systemd-cryptsetup detach luks-4780f917-80bf-4089-913a-4e99164f31be ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/run/systemd/generator/systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "IgnoreOnIsolate": "yes", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": "18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "4096", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "loaded", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "NeedDaemonReload": "yes", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "yes", "RequiredBy": "cryptsetup.target", "Requires": "system-systemd\\x2dcryptsetup.slice", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "Slice": "system-systemd\\x2dcryptsetup.slice", "SourcePath": "/etc/crypttab", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "journal", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", 
"SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "0", "TimeoutStopUSec": "0", "TimerSlackNSec": "50000", "Transient": "no", "Type": "oneshot", "UMask": "0022", "UnitFilePreset": "disabled", "UnitFileState": "bad", "WantedBy": "dev-mapper-foo\\x2dtest1.device", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:58:25 -0400 (0:00:00.583) 0:08:19.820 ****** fatal: [managed-node2]: FAILED! => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } MSG: cannot remove existing formatting on device 'test1' in safe mode due to adding encryption TASK [fedora.linux_system_roles.storage : Failed message] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:109 Saturday 12 October 2024 20:58:29 -0400 (0:00:04.210) 0:08:24.031 ****** fatal: [managed-node2]: FAILED! => { "changed": false } MSG: {u'_ansible_no_log': False, u'crypts': [], u'pools': [], u'leaves': [], u'changed': False, u'actions': [], u'failed': True, u'volumes': [], u'invocation': {u'module_args': {u'packages_only': False, u'disklabel_type': None, u'diskvolume_mkfs_option_map': {u'ext4': u'-F', u'ext3': u'-F', u'ext2': u'-F'}, u'safe_mode': True, u'pools': [{u'raid_metadata_version': None, u'encryption_key_size': None, u'encryption_key': None, u'encryption_luks_version': None, u'encryption_tang_url': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_tang_thumbprint': None, u'name': u'foo', u'encryption_password': None, u'encryption': False, u'disks': [u'sda'], u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [{u'raid_metadata_version': None, u'mount_device_identifier': u'uuid', u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': u'4g', u'mount_point': u'/opt/test1', u'compression': None, u'encryption_password': u'VALUE_SPECIFIED_IN_NO_LOG_PARAMETER', u'encryption': True, u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'mount_mode': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'deduplication': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'mount_passno': 0, u'mount_user': None, u'raid_spare_count': None, u'name': u'test1', u'cache_mode': None, u'raid_disks': [], u'mount_group': None, u'fs_overwrite_existing': True, u'disks': [u'sda'], u'cached': False, u'thin_pool_size': None, u'thin': False, u'mount_check': 0, u'cache_size': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}], u'shared': False, u'encryption_clevis_pin': None, u'type': u'lvm', u'encryption_cipher': None, u'raid_chunk_size': None}], u'volumes': [], u'pool_defaults': {u'raid_metadata_version': None, u'encryption_cipher': None, u'encryption_key': None, u'encryption_luks_version': None, u'raid_spare_count': None, u'grow_to_fill': False, u'encryption_password': None, u'encryption': False, u'disks': [], 
u'raid_level': None, u'raid_device_count': None, u'state': u'present', u'volumes': [], u'shared': False, u'type': u'lvm', u'encryption_key_size': None, u'raid_chunk_size': None}, u'volume_defaults': {u'raid_metadata_version': None, u'raid_level': None, u'fs_type': u'xfs', u'mount_options': u'defaults', u'size': 0, u'mount_point': u'', u'compression': None, u'encryption_password': None, u'encryption': False, u'mount_device_identifier': u'uuid', u'raid_device_count': None, u'state': u'present', u'vdo_pool_size': None, u'thin_pool_name': None, u'type': u'lvm', u'encryption_key_size': None, u'encryption_cipher': None, u'encryption_key': None, u'fs_label': u'', u'encryption_luks_version': None, u'raid_stripe_size': None, u'cache_size': 0, u'raid_spare_count': None, u'cache_mode': None, u'deduplication': None, u'cached': False, u'fs_overwrite_existing': True, u'disks': [], u'thin_pool_size': None, u'thin': None, u'mount_check': 0, u'mount_passno': 0, u'raid_chunk_size': None, u'cache_devices': [], u'fs_create_options': u''}, u'use_partitions': None}}, u'mounts': [], u'packages': [], u'msg': u"cannot remove existing formatting on device 'test1' in safe mode due to adding encryption"} TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:58:29 -0400 (0:00:00.114) 0:08:24.145 ****** changed: [managed-node2] => (item=systemd-cryptsetup@luks\x2d4780f917\x2d80bf\x2d4089\x2d913a\x2d4e99164f31be.service) => { "ansible_loop_var": "item", "changed": true, "item": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "name": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "status": { "ActiveEnterTimestampMonotonic": "0", "ActiveExitTimestampMonotonic": "0", "ActiveState": "inactive", "AllowIsolate": "no", "AmbientCapabilities": "0", "AssertResult": "no", "AssertTimestampMonotonic": "0", "BlockIOAccounting": "no", "BlockIOWeight": "18446744073709551615", "CPUAccounting": "no", "CPUQuotaPerSecUSec": "infinity", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUSchedulingResetOnFork": "no", "CPUShares": "18446744073709551615", "CanIsolate": "no", "CanReload": "no", "CanStart": "yes", "CanStop": "yes", "CapabilityBoundingSet": "18446744073709551615", "CollectMode": "inactive", "ConditionResult": "no", "ConditionTimestampMonotonic": "0", "ControlPID": "0", "DefaultDependencies": "yes", "Delegate": "no", "Description": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "DevicePolicy": "auto", "ExecMainCode": "0", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "0", "ExecMainStartTimestampMonotonic": "0", "ExecMainStatus": "0", "FailureAction": "none", "FileDescriptorStoreMax": "0", "FragmentPath": "/dev/null", "GuessMainPID": "yes", "IOScheduling": "0", "Id": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "IgnoreOnIsolate": "no", "IgnoreOnSnapshot": "no", "IgnoreSIGPIPE": "yes", "InactiveEnterTimestampMonotonic": "0", "InactiveExitTimestampMonotonic": "0", "JobTimeoutAction": "none", "JobTimeoutUSec": "0", "KillMode": "control-group", "KillSignal": "15", "LimitAS": "18446744073709551615", "LimitCORE": "18446744073709551615", "LimitCPU": "18446744073709551615", "LimitDATA": "18446744073709551615", "LimitFSIZE": "18446744073709551615", "LimitLOCKS": 
"18446744073709551615", "LimitMEMLOCK": "65536", "LimitMSGQUEUE": "819200", "LimitNICE": "0", "LimitNOFILE": "1048576", "LimitNPROC": "14311", "LimitRSS": "18446744073709551615", "LimitRTPRIO": "0", "LimitRTTIME": "18446744073709551615", "LimitSIGPENDING": "14311", "LimitSTACK": "18446744073709551615", "LoadState": "masked", "MainPID": "0", "MemoryAccounting": "no", "MemoryCurrent": "18446744073709551615", "MemoryLimit": "18446744073709551615", "MountFlags": "0", "Names": "systemd-cryptsetup@luks\\x2d4780f917\\x2d80bf\\x2d4089\\x2d913a\\x2d4e99164f31be.service", "NeedDaemonReload": "no", "Nice": "0", "NoNewPrivileges": "no", "NonBlocking": "no", "NotifyAccess": "none", "OOMScoreAdjust": "0", "OnFailureJobMode": "replace", "PermissionsStartOnly": "no", "PrivateDevices": "no", "PrivateNetwork": "no", "PrivateTmp": "no", "ProtectHome": "no", "ProtectSystem": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "RemainAfterExit": "no", "Restart": "no", "RestartUSec": "100ms", "Result": "success", "RootDirectoryStartOnly": "no", "RuntimeDirectoryMode": "0755", "SameProcessGroup": "no", "SecureBits": "0", "SendSIGHUP": "no", "SendSIGKILL": "yes", "StandardError": "inherit", "StandardInput": "null", "StandardOutput": "inherit", "StartLimitAction": "none", "StartLimitBurst": "5", "StartLimitInterval": "10000000", "StartupBlockIOWeight": "18446744073709551615", "StartupCPUShares": "18446744073709551615", "StatusErrno": "0", "StopWhenUnneeded": "no", "SubState": "dead", "SyslogLevelPrefix": "yes", "SyslogPriority": "30", "SystemCallErrorNumber": "0", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "TasksAccounting": "no", "TasksCurrent": "18446744073709551615", "TasksMax": "18446744073709551615", "TimeoutStartUSec": "1min 30s", "TimeoutStopUSec": "1min 30s", "TimerSlackNSec": "50000", "Transient": "no", "UMask": "0022", "UnitFileState": "bad", "WatchdogTimestampMonotonic": "0", "WatchdogUSec": "0" } } TASK [Check that we failed in the role] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:23 Saturday 12 October 2024 20:58:30 -0400 (0:00:00.593) 0:08:24.739 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the blivet output and error message are correct] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:28 Saturday 12 October 2024 20:58:30 -0400 (0:00:00.088) 0:08:24.827 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify correct exception or error message] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-failed.yml:39 Saturday 12 October 2024 20:58:30 -0400 (0:00:00.094) 0:08:24.921 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Stat the file] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:11 Saturday 12 October 2024 20:58:30 -0400 (0:00:00.083) 0:08:25.005 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728781095.9737835, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1728781095.9737835, "dev": 64768, 
"device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 67, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0644", "mtime": 1728781095.9737835, "nlink": 1, "path": "/opt/test1/quux", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "version": "18446744071862408785", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Assert file presence] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-data-preservation.yml:16 Saturday 12 October 2024 20:58:31 -0400 (0:00:00.552) 0:08:25.557 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Add encryption to the volume] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:472 Saturday 12 October 2024 20:58:31 -0400 (0:00:00.077) 0:08:25.635 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:58:31 -0400 (0:00:00.201) 0:08:25.837 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:58:31 -0400 (0:00:00.103) 0:08:25.940 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:58:31 -0400 (0:00:00.065) 0:08:26.006 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system is ostree] *********** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:58:31 -0400 (0:00:00.112) 0:08:26.118 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:58:31 -0400 (0:00:00.041) 0:08:26.160 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:58:31 -0400 (0:00:00.046) 0:08:26.207 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:58:31 -0400 (0:00:00.051) 0:08:26.259 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:58:32 -0400 (0:00:00.052) 0:08:26.312 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:58:32 -0400 (0:00:00.154) 0:08:26.466 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:58:33 -0400 (0:00:01.620) 0:08:28.087 ****** ok: [managed-node2] => { "storage_pools": [ { "disks": [ "sda" ], "name": "foo", "type": "lvm", "volumes": [ { "encryption": true, "encryption_password": "yabbadabbadoo", "mount_point": "/opt/test1", "name": "test1", "size": "4g" } ] } ] } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 Saturday 12 October 2024 20:58:33 -0400 (0:00:00.103) 0:08:28.190 ****** ok: [managed-node2] => { "storage_volumes": "VARIABLE IS NOT DEFINED!: 'storage_volumes' is undefined" } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:58:33 -0400 (0:00:00.075) 0:08:28.266 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [ "cryptsetup", "lvm2" ], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:58:38 -0400 (0:00:04.222) 0:08:32.488 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:58:38 -0400 (0:00:00.067) 0:08:32.556 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:58:38 -0400 (0:00:00.044) 0:08:32.600 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:58:38 -0400 (0:00:00.051) 0:08:32.651 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:58:38 -0400 (0:00:00.050) 0:08:32.702 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed", "7:lvm2-2.02.187-6.el7_9.5.x86_64 providing lvm2 is already installed", "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: cryptsetup kpartx lvm2 TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:58:39 -0400 (0:00:01.027) 0:08:33.729 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, 
"auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": "autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", "status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": 
"plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": 
"rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { 
"name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": "systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", 
"state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:58:40 -0400 (0:00:01.030) 0:08:34.760 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:58:40 -0400 (0:00:00.079) 0:08:34.840 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:58:40 -0400 (0:00:00.051) 0:08:34.891 ****** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "password": 
"-", "state": "present" } ], "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 12 October 2024 20:58:51 -0400 (0:00:10.736) 0:08:45.628 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:58:51 -0400 (0:00:00.124) 0:08:45.752 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728781078.4486997, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "3fceedeef6c619b69ada96279531b69ed89734ba", "ctime": 1728781078.4456997, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", 
"mtime": 1728781078.4456997, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1279, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:58:51 -0400 (0:00:00.547) 0:08:46.300 ****** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:58:52 -0400 (0:00:00.389) 0:08:46.690 ****** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:58:52 -0400 (0:00:00.059) 0:08:46.749 ****** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "xfs" }, { "action": "create format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "create device", "device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "fs_type": null }, { "action": "create format", "device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "fs_type": "xfs" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "password": "-", "state": "present" } ], "failed": false, "leaves": [ "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1", "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "state": "mounted" } ], "packages": [ "cryptsetup", "xfsprogs", "e2fsprogs", "lvm2" ], "pools": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", 
"fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ], "volumes": [] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:58:52 -0400 (0:00:00.074) 0:08:46.824 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, "raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:58:52 -0400 (0:00:00.067) 0:08:46.892 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:58:52 -0400 (0:00:00.066) 0:08:46.959 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/foo-test1', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", 
"fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/foo-test1", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/foo-test1" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 October 2024 20:58:53 -0400 (0:00:00.371) 0:08:47.331 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 20:58:53 -0400 (0:00:00.516) 0:08:47.848 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "state": "mounted" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6" } TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 20:58:53 -0400 (0:00:00.393) 0:08:48.241 ****** skipping: [managed-node2] => (item={u'src': u'/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6', u'group': None, u'dump': 0, u'passno': 0, u'fstype': u'xfs', u'state': u'mounted', u'mode': None, u'owner': None, u'path': u'/opt/test1', u'opts': u'defaults'}) => { "ansible_loop_var": "mount_info", "changed": false, "mount_info": { "dump": 0, "fstype": "xfs", "group": null, "mode": null, "opts": "defaults", "owner": null, "passno": 0, "path": "/opt/test1", "src": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "state": "mounted" }, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 20:58:54 -0400 (0:00:00.065) 0:08:48.306 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 20:58:54 -0400 (0:00:00.490) 0:08:48.796 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728781082.0967171, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 0, "charset": "binary", "checksum": "da39a3ee5e6b4b0d3255bfef95601890afd80709", "ctime": 1728781080.2747085, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917510, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, 
"isreg": true, "issock": false, "isuid": false, "mimetype": "inode/x-empty", "mode": "0600", "mtime": 1728781080.2747085, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": false, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": "1741555237", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 20:58:54 -0400 (0:00:00.379) 0:08:49.176 ****** changed: [managed-node2] => (item={u'state': u'present', u'password': u'-', u'name': u'luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "password": "-", "state": "present" } } MSG: line added TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 20:58:55 -0400 (0:00:00.386) 0:08:49.563 ****** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:488 Saturday 12 October 2024 20:58:56 -0400 (0:00:00.750) 0:08:50.313 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 12 October 2024 20:58:56 -0400 (0:00:00.122) 0:08:50.435 ****** ok: [managed-node2] => { "_storage_pools_list": [ { "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_clevis_pin": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "encryption_tang_thumbprint": null, "encryption_tang_url": null, "grow_to_fill": false, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "shared": false, "state": "present", "type": "lvm", "volumes": [ { "_device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "_kernel_device": "/dev/dm-1", "_mount_id": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "_raw_device": "/dev/mapper/foo-test1", "_raw_kernel_device": "/dev/dm-0", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": true, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": "VALUE_SPECIFIED_IN_NO_LOG_PARAMETER", "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "xfs", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": "/opt/test1", "mount_user": null, "name": "test1", "raid_chunk_size": null, "raid_device_count": null, 
"raid_disks": [], "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": "4g", "state": "present", "thin": false, "thin_pool_name": null, "thin_pool_size": null, "type": "lvm", "vdo_pool_size": null } ] } ] } TASK [Print out volume information] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 12 October 2024 20:58:56 -0400 (0:00:00.069) 0:08:50.505 ****** skipping: [managed-node2] => {} TASK [Collect info about the volumes.] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15 Saturday 12 October 2024 20:58:56 -0400 (0:00:00.132) 0:08:50.637 ****** ok: [managed-node2] => { "changed": false, "info": { "/dev/mapper/foo-test1": { "fstype": "crypto_LUKS", "label": "", "mountpoint": "", "name": "/dev/mapper/foo-test1", "size": "4G", "type": "lvm", "uuid": "d54b19e2-d700-4352-9ef0-ca3f826fceb6" }, "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6": { "fstype": "xfs", "label": "", "mountpoint": "/opt/test1", "name": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "size": "4G", "type": "crypt", "uuid": "bbf98e1c-d171-48ef-a83d-a859f2efaf57" }, "/dev/sda": { "fstype": "LVM2_member", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "nXc7RT-HEZY-OYjB-gzgg-aPxl-xWkJ-gSSNJT" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } } TASK [Read the /etc/fstab file for volume existence] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20 Saturday 12 October 2024 20:58:56 -0400 (0:00:00.377) 0:08:51.015 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002866", "end": "2024-10-12 20:58:56.990680", "rc": 0, "start": "2024-10-12 20:58:56.987814" } STDOUT: # system_role:storage # # /etc/fstab # Created by anaconda on Thu Jun 20 10:23:46 2024 # # Accessible filesystems, by reference, are maintained under '/dev/disk' # See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info # 
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0 /dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6 /opt/test1 xfs defaults 0 0 TASK [Read the /etc/crypttab file] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25 Saturday 12 October 2024 20:58:57 -0400 (0:00:00.358) 0:08:51.374 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002811", "end": "2024-10-12 20:58:57.349288", "failed_when_result": false, "rc": 0, "start": "2024-10-12 20:58:57.346477" } STDOUT: luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6 /dev/mapper/foo-test1 - TASK [Verify the volumes listed in storage_pools were correctly managed] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34 Saturday 12 October 2024 20:58:57 -0400 (0:00:00.364) 0:08:51.738 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml for managed-node2 TASK [Set _storage_pool_tests] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:5 Saturday 12 October 2024 20:58:57 -0400 (0:00:00.118) 0:08:51.857 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pool_tests": [ "members", "volumes" ] }, "changed": false } TASK [Get VG shared value status] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:18 Saturday 12 October 2024 20:58:57 -0400 (0:00:00.059) 0:08:51.917 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "vgs", "--noheadings", "--binary", "-o", "shared", "foo" ], "delta": "0:00:00.017778", "end": "2024-10-12 20:58:57.949649", "rc": 0, "start": "2024-10-12 20:58:57.931871" } STDOUT: 0 TASK [Verify that VG shared value checks out] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:24 Saturday 12 October 2024 20:58:58 -0400 (0:00:00.412) 0:08:52.329 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify pool subset] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool.yml:34 Saturday 12 October 2024 20:58:58 -0400 (0:00:00.072) 0:08:52.402 ****** included: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:2 Saturday 12 October 2024 20:58:58 -0400 (0:00:00.119) 0:08:52.521 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_count": "1", "_storage_test_pool_pvs_lvm": [ "/dev/sda" ] }, "changed": false } TASK [Get the canonical device path for each member device] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:8 Saturday 12 October 2024 20:58:58 -0400 (0:00:00.072) 0:08:52.593 ****** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "device": "/dev/sda", "pv": "/dev/sda" } TASK [Set pvs lvm length] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:17 Saturday 12 October 2024 20:58:58 -0400 (0:00:00.377) 0:08:52.971 ****** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": "1" }, "changed": false } TASK [Set pool pvs] ************************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:22 Saturday 12 October 2024 20:58:58 -0400 (0:00:00.059) 0:08:53.031 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_pool_pvs": [ "/dev/sda" ] }, "changed": false } TASK [Verify PV count] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:27 Saturday 12 October 2024 20:58:58 -0400 (0:00:00.055) 0:08:53.086 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:36 Saturday 12 October 2024 20:58:58 -0400 (0:00:00.054) 0:08:53.140 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:41 Saturday 12 October 2024 20:58:58 -0400 (0:00:00.042) 0:08:53.182 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_pv_type": "disk" }, "changed": false } TASK [Set expected pv type] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:46 Saturday 12 October 2024 20:58:58 -0400 (0:00:00.042) 0:08:53.225 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check the type of each PV] *********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:51 
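[Annotation] Before the per-PV checks run, the test has already pinned down what it expects: exactly one PV (/dev/sda) of type disk. The assertions that follow are essentially count and type comparisons. A minimal sketch of that shape, reusing the variables visible in this log; storage_test_blkinfo is an illustrative stand-in for the registered result of the earlier "Collect info about the volumes." task, not necessarily the role's actual variable:

    - name: Verify PV count (sketch)
      assert:
        that:
          - _storage_test_pool_pvs | length == _storage_test_expected_pv_count | int
        msg: "unexpected PV count for pool 'foo'"

    - name: Check the type of each PV (sketch)
      assert:
        that:
          - storage_test_blkinfo.info[pv].type == _storage_test_expected_pv_type
      loop: "{{ _storage_test_pool_pvs }}"
      loop_control:
        loop_var: pv

For this run both assertions pass trivially: one PV, and /dev/sda reports type "disk" in the collected block info.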
Saturday 12 October 2024 20:58:58 -0400 (0:00:00.040) 0:08:53.265 ****** ok: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "pv", "changed": false, "pv": "/dev/sda" } MSG: All assertions passed TASK [Check that blivet supports PV grow to fill] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:64 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.070) 0:08:53.336 ****** ok: [managed-node2] => { "changed": false, "rc": 0 } STDOUT: False STDERR: Shared connection to 10.31.45.43 closed. TASK [Verify that PVs fill the whole devices when they should] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:73 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.269) 0:08:53.605 ****** skipping: [managed-node2] => (item=/dev/sda) => { "ansible_loop_var": "st_pool_pv", "changed": false, "skip_reason": "Conditional result was False", "st_pool_pv": "/dev/sda" } TASK [Check MD RAID] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:83 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.064) 0:08:53.669 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml for managed-node2 TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:8 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.120) 0:08:53.789 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:14 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.058) 0:08:53.847 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:19 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.045) 0:08:53.893 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:24 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.044) 0:08:53.938 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md chunk size regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:29 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.042) 0:08:53.980 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:37 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.045) 0:08:54.025 ****** 
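[Annotation] The grow-to-fill probe just above runs a short Python check on the managed node and prints False, which is why "Verify that PVs fill the whole devices when they should" is then skipped for /dev/sda. A rough equivalent of that probe follows; the exact blivet attribute tested is an assumption here, not lifted from the test source:

    - name: Check that blivet supports PV grow to fill (sketch)
      command: >-
        python -c "import blivet.formats.lvmpv as lvmpv;
        print(hasattr(lvmpv.LVMPhysicalVolume, 'grow_to_fill'))"
      register: storage_test_grow_support
      changed_when: false

    - name: Verify that PVs fill the whole devices when they should (sketch)
      command: pvs --noheadings -o pv_size,dev_size {{ st_pool_pv }}
      loop: "{{ _storage_test_pool_pvs }}"
      loop_control:
        loop_var: st_pool_pv
      when: storage_test_grow_support.stdout | trim | bool

On this EL7 node the installed blivet predates grow-to-fill support, so the probe prints False and the verification loop skips.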
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:46 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.037) 0:08:54.063 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:55 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.036) 0:08:54.100 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:64 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.033) 0:08:54.133 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:74 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.034) 0:08:54.168 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variables used by tests] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-md.yml:83 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.035) 0:08:54.204 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_md_active_devices_re": null, "storage_test_md_chunk_size_re": null, "storage_test_md_metadata_version_re": null, "storage_test_md_spare_devices_re": null }, "changed": false } TASK [Check LVM RAID] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:86 Saturday 12 October 2024 20:58:59 -0400 (0:00:00.035) 0:08:54.239 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml for managed-node2 TASK [Validate pool member LVM RAID settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-lvmraid.yml:2 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.090) 0:08:54.329 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml for managed-node2 TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:8 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.112) 0:08:54.442 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:16 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.055) 0:08:54.498 ****** 
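[Annotation] Everything in this LVM RAID block skips for the same reason: test1 is a plain linear LV, so each task is gated on a raid_level that is null here. The gating pattern looks roughly like the following sketch, with illustrative variable names rather than the role's actual source:

    - name: Get information about the LV (sketch)
      command: lvs --noheadings -o segtype foo/test1
      register: storage_test_lv_segtype
      changed_when: false
      when: storage_test_volume.raid_level is not none

    - name: Check segment type (sketch)
      assert:
        that:
          - storage_test_volume.raid_level == storage_test_lv_segtype.stdout | trim
      when: storage_test_volume.raid_level is not none

With raid_level null, every "when:" above evaluates to false, which is exactly the "Conditional result was False" skip reason repeated through this stretch of the log.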
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:20 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.053) 0:08:54.552 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set LV stripe size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:27 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.051) 0:08:54.603 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested stripe size] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:31 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.053) 0:08:54.656 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected stripe size] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:37 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.052) 0:08:54.709 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check stripe size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-lvmraid.yml:42 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.052) 0:08:54.761 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check Thin Pools] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:89 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.051) 0:08:54.812 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml for managed-node2 TASK [Validate pool member thinpool settings] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-thin.yml:2 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.122) 0:08:54.935 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml for managed-node2 TASK [Get information about thinpool] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:8 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.114) 0:08:55.050 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check that volume is in correct thinpool (when thinp name is provided)] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:16 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.072) 0:08:55.122 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was 
False" } TASK [Check that volume is in thinpool (when thinp name is not provided)] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:22 Saturday 12 October 2024 20:59:00 -0400 (0:00:00.134) 0:08:55.256 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-thin.yml:26 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.055) 0:08:55.312 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_thin_status": null }, "changed": false } TASK [Check member encryption] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:92 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.055) 0:08:55.367 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml for managed-node2 TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:5 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.127) 0:08:55.494 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Validate pool member LUKS settings] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:10 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.068) 0:08:55.563 ****** skipping: [managed-node2] => (item=/dev/sda) => { "_storage_test_pool_member_path": "/dev/sda", "ansible_loop_var": "_storage_test_pool_member_path", "changed": false, "skip_reason": "Conditional result was False" } TASK [Validate pool member crypttab entries] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:17 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.065) 0:08:55.628 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml for managed-node2 TASK [Set variables used by tests] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:2 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.102) 0:08:55.731 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [] }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:6 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.054) 0:08:55.785 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:14 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.061) 
0:08:55.847 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check backing device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:23 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.040) 0:08:55.888 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:32 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.048) 0:08:55.936 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-crypttab.yml:41 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.052) 0:08:55.989 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null }, "changed": false } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-encryption.yml:24 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.053) 0:08:56.042 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_crypttab_key_file": null }, "changed": false } TASK [Check VDO] *************************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:95 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.060) 0:08:56.102 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml for managed-node2 TASK [Validate pool member VDO settings] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-members-vdo.yml:2 Saturday 12 October 2024 20:59:01 -0400 (0:00:00.114) 0:08:56.217 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml for managed-node2 TASK [Get information about VDO deduplication] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:8 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.085) 0:08:56.302 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:15 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.039) 0:08:56.341 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:21 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.042) 0:08:56.384 ****** skipping: 
[managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about VDO compression] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:27 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.055) 0:08:56.440 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is off] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:34 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.058) 0:08:56.499 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check if VDO deduplication is on] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:40 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.065) 0:08:56.565 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-member-vdo.yml:46 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.051) 0:08:56.616 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_vdo_status": null }, "changed": false } TASK [Check Stratis] *********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:98 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.049) 0:08:56.666 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml for managed-node2 TASK [Run 'stratis report'] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:6 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.109) 0:08:56.775 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get information about Stratis] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:11 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.036) 0:08:56.812 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that the pool was created] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:15 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.036) 0:08:56.849 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that encryption is correctly set] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:25 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.049) 0:08:56.899 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify that Clevis/Tang encryption is correctly set] *********************
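[Annotation] The VDO and Stratis blocks above (and the Clevis/Tang check whose banner appears just before this note) behave the same way as the RAID block: the check files are included for every pool, but each task is individually guarded by the pool or volume type, so for this plain LVM pool they all skip and only the variable-reset tasks actually run. The guard amounts to something like the following sketch; the condition shown is illustrative:

    - name: Run 'stratis report' (sketch)
      command: stratis report
      register: storage_test_stratis_report
      changed_when: false
      when: storage_test_pool.type == 'stratis'

    - name: Reset variable used by test
      set_fact:
        storage_test_stratis_report: null

Including the checks unconditionally and gating each one keeps a single verification path for all pool types at the cost of the long runs of "Conditional result was False" seen here.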
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:34 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.053) 0:08:56.953 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Reset variable used by test] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-pool-stratis.yml:44 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.059) 0:08:57.012 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_stratis_report": null }, "changed": false } TASK [Clean up test variables] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-members.yml:101 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.056) 0:08:57.069 ****** ok: [managed-node2] => { "ansible_facts": { "__pvs_lvm_len": null, "_storage_test_expected_pv_count": null, "_storage_test_expected_pv_type": null, "_storage_test_pool_pvs": [], "_storage_test_pool_pvs_lvm": [] }, "changed": false } TASK [Verify the volumes] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-pool-volumes.yml:3 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.066) 0:08:57.135 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2 TASK [Set storage volume test variables] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2 Saturday 12 October 2024 20:59:02 -0400 (0:00:00.121) 0:08:57.257 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": true, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false } TASK [Run test verify for {{ storage_test_volume_subset }}] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19 Saturday 12 October 2024 20:59:03 -0400 (0:00:00.074) 0:08:57.332 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2 included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2 TASK [Get expected mount device based on device type] ************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7 Saturday 12 October 2024 20:59:03 -0400 (0:00:00.440) 0:08:57.773 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6" }, "changed": false } TASK [Set some facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11 Saturday 12 October 2024 20:59:03 -0400 (0:00:00.108) 0:08:57.882 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "/opt/test1", "storage_test_swap_expected_matches": "0" }, "changed": false } TASK [Get information about the mountpoint directory] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19 Saturday 12 October 2024 20:59:03 -0400 (0:00:00.087) 0:08:57.969 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the current mount state by device] ******************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28 Saturday 12 October 2024 20:59:03 -0400 (0:00:00.061) 0:08:58.031 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount directory user] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36 Saturday 12 October 2024 20:59:03 -0400 (0:00:00.085) 0:08:58.117 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory group] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42 Saturday 12 October 2024 20:59:03 -0400 (0:00:00.063) 0:08:58.180 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify mount directory permissions] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48 Saturday 12 October 2024 20:59:03 -0400 (0:00:00.076) 0:08:58.257 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Get path of test volume device] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.054) 0:08:58.311 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Gather swap info] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.064) 0:08:58.376 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify swap status] ****************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.061) 0:08:58.437 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Unset facts] ************************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.067) 0:08:58.505 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false } TASK [Set some variables for fstab checking] *********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.050) 0:08:58.555 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "1", "storage_test_fstab_expected_mount_options_matches": "1", "storage_test_fstab_expected_mount_point_matches": "1", "storage_test_fstab_id_matches": [ "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6 " ], "storage_test_fstab_mount_options_matches": [ " /opt/test1 xfs defaults " ], "storage_test_fstab_mount_point_matches": [ " /opt/test1 " ] }, "changed": false } TASK [Verify that the device identifier appears in /etc/fstab] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.104) 0:08:58.660 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the fstab mount point] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.069) 0:08:58.729 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify mount_options] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.074) 0:08:58.803 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify fingerprint] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.166) 0:08:58.970 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clean up variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.062) 0:08:59.033 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, 
"storage_test_fstab_mount_point_matches": null }, "changed": false } TASK [Verify fs type] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.055) 0:08:59.089 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify fs label] ********************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14 Saturday 12 October 2024 20:59:04 -0400 (0:00:00.089) 0:08:59.178 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [See whether the device node is present] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3 Saturday 12 October 2024 20:59:05 -0400 (0:00:00.170) 0:08:59.349 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728781131.0049512, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728781131.0049512, "dev": 5, "device_type": 64768, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 385962, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1728781131.0049512, "nlink": 1, "path": "/dev/mapper/foo-test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9 Saturday 12 October 2024 20:59:05 -0400 (0:00:00.688) 0:09:00.037 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the presence/absence of the device node] ************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16 Saturday 12 October 2024 20:59:05 -0400 (0:00:00.122) 0:09:00.159 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Make sure we got info about this volume] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23 Saturday 12 October 2024 20:59:05 -0400 (0:00:00.078) 0:09:00.237 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Process volume type (set initial value) (1/2)] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29 Saturday 12 October 2024 20:59:06 -0400 (0:00:00.111) 0:09:00.348 ****** ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "lvm" }, "changed": false } TASK [Process volume type (get RAID value) (2/2)] ****************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33 Saturday 12 October 2024 20:59:06 -0400 (0:00:00.118) 
0:09:00.467 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Verify the volume's device type] ***************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38 Saturday 12 October 2024 20:59:06 -0400 (0:00:00.079) 0:09:00.547 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Stat the LUKS device, if encrypted] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3 Saturday 12 October 2024 20:59:06 -0400 (0:00:00.073) 0:09:00.620 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728781131.1139517, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728781131.1139517, "dev": 5, "device_type": 64769, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 397242, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/symlink", "mode": "0660", "mtime": 1728781131.1139517, "nlink": 1, "path": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [Ensure cryptsetup is present] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10 Saturday 12 October 2024 20:59:06 -0400 (0:00:00.476) 0:09:01.097 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] } lsrpackages: cryptsetup TASK [Collect LUKS info for this volume] *************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16 Saturday 12 October 2024 20:59:07 -0400 (0:00:00.729) 0:09:01.827 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "cryptsetup", "luksDump", "/dev/mapper/foo-test1" ], "delta": "0:00:00.025059", "end": "2024-10-12 20:59:07.927803", "rc": 0, "start": "2024-10-12 20:59:07.902744" } STDOUT: LUKS header information for /dev/mapper/foo-test1 Version: 1 Cipher name: aes Cipher mode: xts-plain64 Hash spec: sha256 Payload offset: 8192 MK bits: 512 MK digest: 1f 8b cc a9 68 1d 04 f8 18 4d 0a ff 96 f1 6a 48 ac b5 c8 b2 MK salt: 2d ff 9b 0e 18 c8 43 24 b6 ef 21 cb d1 7c 41 bd 1c ac 15 8b f7 8d 49 54 93 e7 12 cb 0f bf ab 76 MK iterations: 23848 UUID: d54b19e2-d700-4352-9ef0-ca3f826fceb6 Key Slot 0: ENABLED Iterations: 381576 Salt: b5 e8 bf ab 26 92 70 c6 94 fc a1 93 45 6d 97 5f 9b a3 bd 64 bf 03 34 51 27 d0 41 20 bd b0 43 17 Key material offset: 8 AF stripes: 4000 Key Slot 1: DISABLED Key Slot 2: DISABLED Key Slot 3: DISABLED Key Slot 4: DISABLED Key Slot 5: DISABLED Key Slot 6: DISABLED Key Slot 7: DISABLED TASK [Verify the presence/absence of the LUKS device node] ********************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22 Saturday 12 October 2024 20:59:07 -0400 (0:00:00.466) 0:09:02.294 ****** 
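[Annotation] The luksDump output above is the ground truth the encryption assertions read: a LUKS1 header ("Version: 1"), aes in xts-plain64 mode, a 512-bit master key, and key slot 0 as the only enabled slot. The version/key-size/cipher checks are skipped in this run because the test did not pin them (encryption_luks_version, encryption_key_size, and encryption_cipher are all null), but when they do run they amount to pattern matches against this dump, roughly as sketched below; the register name and regexes are illustrative:

    - name: Collect LUKS info for this volume
      command: cryptsetup luksDump /dev/mapper/foo-test1
      register: storage_test_luks_dump
      changed_when: false

    - name: Check LUKS version / cipher / key size (sketch)
      assert:
        that:
          - storage_test_luks_dump.stdout is search('Version:\s+1')
          - storage_test_luks_dump.stdout is search('Cipher name:\s+aes')
          - storage_test_luks_dump.stdout is search('MK bits:\s+512')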
ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify that the raw device is the same as the device if not encrypted] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.056) 0:09:02.351 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Make sure we got info about the LUKS volume if encrypted] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.062) 0:09:02.413 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Verify the LUKS volume's device type if encrypted] *********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.067) 0:09:02.480 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check LUKS version] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.052) 0:09:02.532 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS key size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.041) 0:09:02.574 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check LUKS cipher] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.043) 0:09:02.617 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set test variables] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.058) 0:09:02.676 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [ "luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6 /dev/mapper/foo-test1 -" ], "_storage_test_expected_crypttab_entries": "1", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false } TASK [Check for /etc/crypttab entry] ******************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.077) 0:09:02.754 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Validate the format of the crypttab entry] ******************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.074) 0:09:02.828 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check backing 
device of crypttab entry] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.068) 0:09:02.896 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Check key file of crypttab entry] **************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.069) 0:09:02.966 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Clear test variables] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.099) 0:09:03.066 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false } TASK [Get information about RAID] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:8 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.076) 0:09:03.143 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set active devices regex] ************************************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:14 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.078) 0:09:03.221 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set spare devices regex] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19 Saturday 12 October 2024 20:59:08 -0400 (0:00:00.074) 0:09:03.296 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set md version regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24 Saturday 12 October 2024 20:59:09 -0400 (0:00:00.053) 0:09:03.349 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set chunk size regex] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29 Saturday 12 October 2024 20:59:09 -0400 (0:00:00.064) 0:09:03.414 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the chunk size] **************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37 Saturday 12 October 2024 20:59:09 -0400 (0:00:00.061) 0:09:03.475 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID active devices count] ***************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46 Saturday 12 October 2024 20:59:09 -0400 (0:00:00.105) 0:09:03.581 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID spare devices count] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54 Saturday 12 October 2024 20:59:09 -0400 (0:00:00.052) 0:09:03.634 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID metadata version] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62 Saturday 12 October 2024 20:59:09 -0400 (0:00:00.051) 0:09:03.685 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check RAID chunk size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70 Saturday 12 October 2024 20:59:09 -0400 (0:00:00.051) 0:09:03.736 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the actual size of the volume] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3 Saturday 12 October 2024 20:59:09 -0400 (0:00:00.058) 0:09:03.795 ****** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Parse the requested size of the volume] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11 Saturday 12 October 2024 20:59:09 -0400 (0:00:00.467) 0:09:04.262 ****** ok: [managed-node2] => { "bytes": 4294967296, "changed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } TASK [Establish base value for expected size] ********************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20 Saturday 12 October 2024 20:59:10 -0400 (0:00:00.372) 0:09:04.635 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_expected_size": "4294967296" }, "changed": false } TASK [Show expected size] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28 Saturday 12 October 2024 20:59:10 -0400 (0:00:00.082) 0:09:04.717 ****** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Get the size of parent/pool device] ************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32 Saturday 12 October 2024 20:59:10 -0400 (0:00:00.056) 0:09:04.774 ****** ok: [managed-node2] => { "bytes": 10737418240, "changed": false, "lvm": "10g", "parted": "10GiB", "size": "10 GiB" } TASK [Show test pool] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46 Saturday 12 October 2024 20:59:10 -0400 
(0:00:00.340) 0:09:05.115 ****** skipping: [managed-node2] => {} TASK [Show test blockinfo] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50 Saturday 12 October 2024 20:59:10 -0400 (0:00:00.059) 0:09:05.174 ****** skipping: [managed-node2] => {} TASK [Show test pool size] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54 Saturday 12 October 2024 20:59:10 -0400 (0:00:00.062) 0:09:05.236 ****** skipping: [managed-node2] => {} TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.067) 0:09:05.304 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default thin pool reserved space values] ********************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.067) 0:09:05.371 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default minimal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.053) 0:09:05.425 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Default maximal thin pool reserved space size] *************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.055) 0:09:05.481 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate maximum usable space in thin pool] ***************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.069) 0:09:05.551 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply upper size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.113) 0:09:05.664 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Apply lower size limit to max usable thin pool space] ******************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.059) 0:09:05.723 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Convert maximum usable thin pool space from int to Size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96 Saturday 12 October 2024 20:59:11 -0400 
(0:00:00.074) 0:09:05.798 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show max thin pool size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.069) 0:09:05.867 ****** skipping: [managed-node2] => {} TASK [Show volume thin pool size] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.063) 0:09:05.931 ****** skipping: [managed-node2] => {} TASK [Show test volume size] *************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.076) 0:09:06.008 ****** skipping: [managed-node2] => {} TASK [Establish base value for expected thin pool size] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.093) 0:09:06.102 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected size based on pool size and percentage value] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.109) 0:09:06.211 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Establish base value for expected thin pool volume size] ***************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127 Saturday 12 October 2024 20:59:11 -0400 (0:00:00.070) 0:09:06.282 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Calculate the expected thin pool volume size based on percentage value] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131 Saturday 12 October 2024 20:59:12 -0400 (0:00:00.083) 0:09:06.365 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Replace expected volume size with calculated value] ********************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137 Saturday 12 October 2024 20:59:12 -0400 (0:00:00.114) 0:09:06.480 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Show actual size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143 Saturday 12 October 2024 20:59:12 -0400 (0:00:00.073) 0:09:06.556 ****** ok: [managed-node2] => { "storage_test_actual_size": { "bytes": 4294967296, "changed": false, "failed": false, "lvm": "4g", "parted": "4GiB", "size": "4 GiB" } } TASK [Show expected size] ****************************************************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147 Saturday 12 October 2024 20:59:12 -0400 (0:00:00.065) 0:09:06.622 ****** ok: [managed-node2] => { "storage_test_expected_size": "4294967296" } TASK [Assert expected size is actual size] ************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151 Saturday 12 October 2024 20:59:12 -0400 (0:00:00.067) 0:09:06.689 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Get information about the LV] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5 Saturday 12 October 2024 20:59:12 -0400 (0:00:00.086) 0:09:06.775 ****** ok: [managed-node2] => { "changed": false, "cmd": [ "lvs", "--noheadings", "--nameprefixes", "--units=b", "--nosuffix", "--unquoted", "-o", "name,attr,cache_total_blocks,chunk_size,segtype", "foo/test1" ], "delta": "0:00:00.020721", "end": "2024-10-12 20:59:12.845640", "rc": 0, "start": "2024-10-12 20:59:12.824919" } STDOUT: LVM2_LV_NAME=test1 LVM2_LV_ATTR=-wi-ao---- LVM2_CACHE_TOTAL_BLOCKS= LVM2_CHUNK_SIZE=0 LVM2_SEGTYPE=linear TASK [Set LV segment type] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13 Saturday 12 October 2024 20:59:12 -0400 (0:00:00.460) 0:09:07.236 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_lv_segtype": [ "linear" ] }, "changed": false } TASK [Check segment type] ****************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.068) 0:09:07.304 ****** ok: [managed-node2] => { "changed": false } MSG: All assertions passed TASK [Set LV cache size] ******************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.075) 0:09:07.380 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Parse the requested cache size] ****************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.060) 0:09:07.440 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Set expected cache size] ************************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.056) 0:09:07.497 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [Check cache size] ******************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.052) 0:09:07.549 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK 
[Clean up facts] ********************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.036) 0:09:07.586 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false } TASK [Verify the volumes with no pool were correctly managed] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.035) 0:09:07.621 ****** TASK [Clean up variable namespace] ********************************************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.134) 0:09:07.756 ****** ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false } TASK [Clean up] **************************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:491 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.073) 0:09:07.829 ****** TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:2 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.231) 0:09:08.061 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Ensure ansible_facts used by role] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:2 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.086) 0:09:08.147 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set platform/version specific variables] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:7 Saturday 12 October 2024 20:59:13 -0400 (0:00:00.077) 0:09:08.225 ****** skipping: [managed-node2] => (item=RedHat.yml) => { "ansible_loop_var": "item", "changed": false, "item": "RedHat.yml", "skip_reason": "Conditional result was False" } skipping: [managed-node2] => (item=CentOS.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS.yml", "skip_reason": "Conditional result was False" } ok: [managed-node2] => (item=CentOS_7.yml) => { "ansible_facts": { "__storage_blivet_diskvolume_mkfs_option_map": { "ext2": "-F", "ext3": "-F", "ext4": "-F" }, "blivet_package_list": [ "python-enum34", "python-blivet3", "libblockdev-crypto", "libblockdev-dm", "libblockdev-lvm", "libblockdev-mdraid", "libblockdev-swap", "{{ 'libblockdev-s390' if ansible_architecture == 's390x' else 'libblockdev' }}" ] }, "ansible_included_var_files": [ "/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/vars/CentOS_7.yml" ], "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.yml" } skipping: [managed-node2] => (item=CentOS_7.9.yml) => { "ansible_loop_var": "item", "changed": false, "item": "CentOS_7.9.yml", "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if system 
is ostree] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:25 Saturday 12 October 2024 20:59:14 -0400 (0:00:00.139) 0:09:08.365 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Set flag to indicate system is ostree] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/set_vars.yml:30 Saturday 12 October 2024 20:59:14 -0400 (0:00:00.087) 0:09:08.453 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Define an empty list of pools to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:5 Saturday 12 October 2024 20:59:14 -0400 (0:00:00.110) 0:09:08.563 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Define an empty list of volumes to be used in testing] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:9 Saturday 12 October 2024 20:59:14 -0400 (0:00:00.088) 0:09:08.651 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Include the appropriate provider tasks] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main.yml:13 Saturday 12 October 2024 20:59:14 -0400 (0:00:00.099) 0:09:08.751 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Make sure blivet is available] ******* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2 Saturday 12 October 2024 20:59:14 -0400 (0:00:00.206) 0:09:08.958 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "python-enum34-1.0.4-1.el7.noarch providing python-enum34 is already installed", "1:python2-blivet3-3.1.3-3.el7.noarch providing python-blivet3 is already installed", "libblockdev-crypto-2.18-5.el7.x86_64 providing libblockdev-crypto is already installed", "libblockdev-dm-2.18-5.el7.x86_64 providing libblockdev-dm is already installed", "libblockdev-lvm-2.18-5.el7.x86_64 providing libblockdev-lvm is already installed", "libblockdev-mdraid-2.18-5.el7.x86_64 providing libblockdev-mdraid is already installed", "libblockdev-swap-2.18-5.el7.x86_64 providing libblockdev-swap is already installed", "libblockdev-2.18-5.el7.x86_64 providing libblockdev is already installed" ] } lsrpackages: libblockdev libblockdev-crypto libblockdev-dm libblockdev-lvm libblockdev-mdraid libblockdev-swap python-blivet3 python-enum34 TASK [fedora.linux_system_roles.storage : Show storage_pools] ****************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:9 Saturday 12 October 2024 20:59:16 -0400 (0:00:01.416) 0:09:10.375 ****** ok: [managed-node2] => { "storage_pools": "VARIABLE IS NOT DEFINED!: 'storage_pools' is undefined" } TASK [fedora.linux_system_roles.storage : Show storage_volumes] **************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:14 
Saturday 12 October 2024 20:59:16 -0400 (0:00:00.038) 0:09:10.413 ****** ok: [managed-node2] => { "storage_volumes": [ { "disks": [ "sda" ], "name": "foo", "state": "absent", "type": "disk" } ] } TASK [fedora.linux_system_roles.storage : Get required packages] *************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19 Saturday 12 October 2024 20:59:16 -0400 (0:00:00.041) 0:09:10.454 ****** ok: [managed-node2] => { "actions": [], "changed": false, "crypts": [], "leaves": [], "mounts": [], "packages": [], "pools": [], "volumes": [] } TASK [fedora.linux_system_roles.storage : Enable copr repositories if needed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:31 Saturday 12 October 2024 20:59:20 -0400 (0:00:03.982) 0:09:14.437 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml for managed-node2 TASK [fedora.linux_system_roles.storage : Check if the COPR support packages should be installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:2 Saturday 12 October 2024 20:59:20 -0400 (0:00:00.099) 0:09:14.536 ****** TASK [fedora.linux_system_roles.storage : Make sure COPR support packages are present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:13 Saturday 12 October 2024 20:59:20 -0400 (0:00:00.054) 0:09:14.591 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Enable COPRs] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/enable_coprs.yml:19 Saturday 12 October 2024 20:59:20 -0400 (0:00:00.056) 0:09:14.647 ****** TASK [fedora.linux_system_roles.storage : Make sure required packages are installed] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:37 Saturday 12 October 2024 20:59:20 -0400 (0:00:00.048) 0:09:14.695 ****** ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "kpartx-0.4.9-136.el7_9.x86_64 providing kpartx is already installed" ] } lsrpackages: kpartx TASK [fedora.linux_system_roles.storage : Get service facts] ******************* task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:51 Saturday 12 October 2024 20:59:21 -0400 (0:00:00.912) 0:09:15.608 ****** ok: [managed-node2] => { "ansible_facts": { "services": { "NetworkManager-dispatcher.service": { "name": "NetworkManager-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "NetworkManager-wait-online.service": { "name": "NetworkManager-wait-online.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "NetworkManager.service": { "name": "NetworkManager.service", "source": "systemd", "state": "running", "status": "enabled" }, "arp-ethers.service": { "name": "arp-ethers.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "auditd.service": { "name": "auditd.service", "source": "systemd", "state": "running", "status": "enabled" }, "auth-rpcgss-module.service": { "name": "auth-rpcgss-module.service", "source": "systemd", "state": "stopped", "status": "static" }, "autovt@.service": { "name": 
"autovt@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "blivet.service": { "name": "blivet.service", "source": "systemd", "state": "inactive", "status": "static" }, "blk-availability.service": { "name": "blk-availability.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "brandbot.service": { "name": "brandbot.service", "source": "systemd", "state": "inactive", "status": "static" }, "chrony-dnssrv@.service": { "name": "chrony-dnssrv@.service", "source": "systemd", "state": "unknown", "status": "static" }, "chrony-wait.service": { "name": "chrony-wait.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "chronyd.service": { "name": "chronyd.service", "source": "systemd", "state": "running", "status": "enabled" }, "cloud-config.service": { "name": "cloud-config.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-final.service": { "name": "cloud-final.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init-local.service": { "name": "cloud-init-local.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "cloud-init.service": { "name": "cloud-init.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "console-getty.service": { "name": "console-getty.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "console-shell.service": { "name": "console-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "container-getty@.service": { "name": "container-getty@.service", "source": "systemd", "state": "unknown", "status": "static" }, "cpupower.service": { "name": "cpupower.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "crond.service": { "name": "crond.service", "source": "systemd", "state": "running", "status": "enabled" }, "dbus-org.freedesktop.hostname1.service": { "name": "dbus-org.freedesktop.hostname1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.import1.service": { "name": "dbus-org.freedesktop.import1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.locale1.service": { "name": "dbus-org.freedesktop.locale1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.login1.service": { "name": "dbus-org.freedesktop.login1.service", "source": "systemd", "state": "active", "status": "static" }, "dbus-org.freedesktop.machine1.service": { "name": "dbus-org.freedesktop.machine1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus-org.freedesktop.nm-dispatcher.service": { "name": "dbus-org.freedesktop.nm-dispatcher.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "dbus-org.freedesktop.timedate1.service": { "name": "dbus-org.freedesktop.timedate1.service", "source": "systemd", "state": "inactive", "status": "static" }, "dbus.service": { "name": "dbus.service", "source": "systemd", "state": "running", "status": "static" }, "debug-shell.service": { "name": "debug-shell.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "dm-event.service": { "name": "dm-event.service", "source": "systemd", "state": "stopped", "status": "static" }, "dmraid-activation.service": { "name": "dmraid-activation.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "dracut-cmdline.service": { "name": "dracut-cmdline.service", "source": "systemd", "state": 
"stopped", "status": "static" }, "dracut-initqueue.service": { "name": "dracut-initqueue.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-mount.service": { "name": "dracut-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-mount.service": { "name": "dracut-pre-mount.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-pivot.service": { "name": "dracut-pre-pivot.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-trigger.service": { "name": "dracut-pre-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-pre-udev.service": { "name": "dracut-pre-udev.service", "source": "systemd", "state": "stopped", "status": "static" }, "dracut-shutdown.service": { "name": "dracut-shutdown.service", "source": "systemd", "state": "stopped", "status": "static" }, "ebtables.service": { "name": "ebtables.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "emergency.service": { "name": "emergency.service", "source": "systemd", "state": "stopped", "status": "static" }, "firewalld.service": { "name": "firewalld.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "fstrim.service": { "name": "fstrim.service", "source": "systemd", "state": "inactive", "status": "static" }, "getty@.service": { "name": "getty@.service", "source": "systemd", "state": "unknown", "status": "enabled" }, "getty@tty1.service": { "name": "getty@tty1.service", "source": "systemd", "state": "running", "status": "unknown" }, "gssproxy.service": { "name": "gssproxy.service", "source": "systemd", "state": "running", "status": "disabled" }, "halt-local.service": { "name": "halt-local.service", "source": "systemd", "state": "inactive", "status": "static" }, "initrd-cleanup.service": { "name": "initrd-cleanup.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-parse-etc.service": { "name": "initrd-parse-etc.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-switch-root.service": { "name": "initrd-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "initrd-udevadm-cleanup-db.service": { "name": "initrd-udevadm-cleanup-db.service", "source": "systemd", "state": "stopped", "status": "static" }, "iprdump.service": { "name": "iprdump.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprinit.service": { "name": "iprinit.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "iprupdate.service": { "name": "iprupdate.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "irqbalance.service": { "name": "irqbalance.service", "source": "systemd", "state": "running", "status": "enabled" }, "kdump.service": { "name": "kdump.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "kmod-static-nodes.service": { "name": "kmod-static-nodes.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-lvmetad.service": { "name": "lvm2-lvmetad.service", "source": "systemd", "state": "running", "status": "static" }, "lvm2-lvmpolld.service": { "name": "lvm2-lvmpolld.service", "source": "systemd", "state": "stopped", "status": "static" }, "lvm2-monitor.service": { "name": "lvm2-monitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "lvm2-pvscan@.service": { "name": "lvm2-pvscan@.service", "source": "systemd", "state": "unknown", 
"status": "static" }, "lvm2-pvscan@8:0.service": { "name": "lvm2-pvscan@8:0.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "mdadm-grow-continue@.service": { "name": "mdadm-grow-continue@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdadm-last-resort@.service": { "name": "mdadm-last-resort@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdcheck_continue.service": { "name": "mdcheck_continue.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdcheck_start.service": { "name": "mdcheck_start.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmon@.service": { "name": "mdmon@.service", "source": "systemd", "state": "unknown", "status": "static" }, "mdmonitor-oneshot.service": { "name": "mdmonitor-oneshot.service", "source": "systemd", "state": "inactive", "status": "static" }, "mdmonitor.service": { "name": "mdmonitor.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "messagebus.service": { "name": "messagebus.service", "source": "systemd", "state": "active", "status": "static" }, "microcode.service": { "name": "microcode.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "netconsole": { "name": "netconsole", "source": "sysv", "state": "stopped", "status": "disabled" }, "network": { "name": "network", "source": "sysv", "state": "running", "status": "enabled" }, "network.service": { "name": "network.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "nfs-blkmap.service": { "name": "nfs-blkmap.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-config.service": { "name": "nfs-config.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-idmap.service": { "name": "nfs-idmap.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-idmapd.service": { "name": "nfs-idmapd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-lock.service": { "name": "nfs-lock.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-mountd.service": { "name": "nfs-mountd.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs-rquotad.service": { "name": "nfs-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfs-secure.service": { "name": "nfs-secure.service", "source": "systemd", "state": "inactive", "status": "static" }, "nfs-server.service": { "name": "nfs-server.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "nfs-utils.service": { "name": "nfs-utils.service", "source": "systemd", "state": "stopped", "status": "static" }, "nfs.service": { "name": "nfs.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "nfslock.service": { "name": "nfslock.service", "source": "systemd", "state": "inactive", "status": "static" }, "plymouth-halt.service": { "name": "plymouth-halt.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-kexec.service": { "name": "plymouth-kexec.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-poweroff.service": { "name": "plymouth-poweroff.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-quit-wait.service": { "name": "plymouth-quit-wait.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-quit.service": { "name": "plymouth-quit.service", "source": 
"systemd", "state": "stopped", "status": "disabled" }, "plymouth-read-write.service": { "name": "plymouth-read-write.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-reboot.service": { "name": "plymouth-reboot.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "plymouth-start.service": { "name": "plymouth-start.service", "source": "systemd", "state": "stopped", "status": "disabled" }, "plymouth-switch-root.service": { "name": "plymouth-switch-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "polkit.service": { "name": "polkit.service", "source": "systemd", "state": "running", "status": "static" }, "postfix.service": { "name": "postfix.service", "source": "systemd", "state": "running", "status": "enabled" }, "qemu-guest-agent.service": { "name": "qemu-guest-agent.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "quotaon.service": { "name": "quotaon.service", "source": "systemd", "state": "inactive", "status": "static" }, "rc-local.service": { "name": "rc-local.service", "source": "systemd", "state": "stopped", "status": "static" }, "rdisc.service": { "name": "rdisc.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rescue.service": { "name": "rescue.service", "source": "systemd", "state": "stopped", "status": "static" }, "restraintd.service": { "name": "restraintd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rhel-autorelabel-mark.service": { "name": "rhel-autorelabel-mark.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-autorelabel.service": { "name": "rhel-autorelabel.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-configure.service": { "name": "rhel-configure.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-dmesg.service": { "name": "rhel-dmesg.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-domainname.service": { "name": "rhel-domainname.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-import-state.service": { "name": "rhel-import-state.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-loadmodules.service": { "name": "rhel-loadmodules.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rhel-readonly.service": { "name": "rhel-readonly.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "rngd.service": { "name": "rngd.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpc-gssd.service": { "name": "rpc-gssd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-rquotad.service": { "name": "rpc-rquotad.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "rpc-statd-notify.service": { "name": "rpc-statd-notify.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpc-statd.service": { "name": "rpc-statd.service", "source": "systemd", "state": "stopped", "status": "static" }, "rpcbind.service": { "name": "rpcbind.service", "source": "systemd", "state": "running", "status": "enabled" }, "rpcgssd.service": { "name": "rpcgssd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rpcidmapd.service": { "name": "rpcidmapd.service", "source": "systemd", "state": "inactive", "status": "static" }, "rsyncd.service": { "name": "rsyncd.service", "source": "systemd", "state": "inactive", "status": 
"disabled" }, "rsyncd@.service": { "name": "rsyncd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "rsyslog.service": { "name": "rsyslog.service", "source": "systemd", "state": "running", "status": "enabled" }, "selinux-policy-migrate-local-changes@.service": { "name": "selinux-policy-migrate-local-changes@.service", "source": "systemd", "state": "unknown", "status": "static" }, "selinux-policy-migrate-local-changes@targeted.service": { "name": "selinux-policy-migrate-local-changes@targeted.service", "source": "systemd", "state": "stopped", "status": "unknown" }, "serial-getty@.service": { "name": "serial-getty@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "serial-getty@ttyS0.service": { "name": "serial-getty@ttyS0.service", "source": "systemd", "state": "running", "status": "unknown" }, "sshd-keygen.service": { "name": "sshd-keygen.service", "source": "systemd", "state": "stopped", "status": "static" }, "sshd.service": { "name": "sshd.service", "source": "systemd", "state": "running", "status": "enabled" }, "sshd@.service": { "name": "sshd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-ask-password-console.service": { "name": "systemd-ask-password-console.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-plymouth.service": { "name": "systemd-ask-password-plymouth.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-ask-password-wall.service": { "name": "systemd-ask-password-wall.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-backlight@.service": { "name": "systemd-backlight@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-binfmt.service": { "name": "systemd-binfmt.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-bootchart.service": { "name": "systemd-bootchart.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "systemd-firstboot.service": { "name": "systemd-firstboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck-root.service": { "name": "systemd-fsck-root.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-fsck@.service": { "name": "systemd-fsck@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-halt.service": { "name": "systemd-halt.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hibernate-resume@.service": { "name": "systemd-hibernate-resume@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-hibernate.service": { "name": "systemd-hibernate.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hostnamed.service": { "name": "systemd-hostnamed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-hwdb-update.service": { "name": "systemd-hwdb-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-hybrid-sleep.service": { "name": "systemd-hybrid-sleep.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-importd.service": { "name": "systemd-importd.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-initctl.service": { "name": "systemd-initctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-catalog-update.service": { "name": 
"systemd-journal-catalog-update.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journal-flush.service": { "name": "systemd-journal-flush.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-journald.service": { "name": "systemd-journald.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-kexec.service": { "name": "systemd-kexec.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-localed.service": { "name": "systemd-localed.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-logind.service": { "name": "systemd-logind.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-machine-id-commit.service": { "name": "systemd-machine-id-commit.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-machined.service": { "name": "systemd-machined.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-modules-load.service": { "name": "systemd-modules-load.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-nspawn@.service": { "name": "systemd-nspawn@.service", "source": "systemd", "state": "unknown", "status": "disabled" }, "systemd-poweroff.service": { "name": "systemd-poweroff.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-quotacheck.service": { "name": "systemd-quotacheck.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-random-seed.service": { "name": "systemd-random-seed.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-readahead-collect.service": { "name": "systemd-readahead-collect.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-readahead-done.service": { "name": "systemd-readahead-done.service", "source": "systemd", "state": "stopped", "status": "indirect" }, "systemd-readahead-drop.service": { "name": "systemd-readahead-drop.service", "source": "systemd", "state": "inactive", "status": "enabled" }, "systemd-readahead-replay.service": { "name": "systemd-readahead-replay.service", "source": "systemd", "state": "stopped", "status": "enabled" }, "systemd-reboot.service": { "name": "systemd-reboot.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-remount-fs.service": { "name": "systemd-remount-fs.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-rfkill@.service": { "name": "systemd-rfkill@.service", "source": "systemd", "state": "unknown", "status": "static" }, "systemd-shutdownd.service": { "name": "systemd-shutdownd.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-suspend.service": { "name": "systemd-suspend.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-sysctl.service": { "name": "systemd-sysctl.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-timedated.service": { "name": "systemd-timedated.service", "source": "systemd", "state": "inactive", "status": "static" }, "systemd-tmpfiles-clean.service": { "name": "systemd-tmpfiles-clean.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup-dev.service": { "name": "systemd-tmpfiles-setup-dev.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-tmpfiles-setup.service": { "name": "systemd-tmpfiles-setup.service", "source": 
"systemd", "state": "stopped", "status": "static" }, "systemd-udev-settle.service": { "name": "systemd-udev-settle.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udev-trigger.service": { "name": "systemd-udev-trigger.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-udevd.service": { "name": "systemd-udevd.service", "source": "systemd", "state": "running", "status": "static" }, "systemd-update-done.service": { "name": "systemd-update-done.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp-runlevel.service": { "name": "systemd-update-utmp-runlevel.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-update-utmp.service": { "name": "systemd-update-utmp.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-user-sessions.service": { "name": "systemd-user-sessions.service", "source": "systemd", "state": "stopped", "status": "static" }, "systemd-vconsole-setup.service": { "name": "systemd-vconsole-setup.service", "source": "systemd", "state": "stopped", "status": "static" }, "target.service": { "name": "target.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "targetclid.service": { "name": "targetclid.service", "source": "systemd", "state": "inactive", "status": "disabled" }, "teamd@.service": { "name": "teamd@.service", "source": "systemd", "state": "unknown", "status": "static" }, "tuned.service": { "name": "tuned.service", "source": "systemd", "state": "running", "status": "enabled" }, "wpa_supplicant.service": { "name": "wpa_supplicant.service", "source": "systemd", "state": "inactive", "status": "disabled" } } }, "changed": false } TASK [fedora.linux_system_roles.storage : Set storage_cryptsetup_services] ***** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:57 Saturday 12 October 2024 20:59:22 -0400 (0:00:01.058) 0:09:16.666 ****** ok: [managed-node2] => { "ansible_facts": { "storage_cryptsetup_services": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Mask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:63 Saturday 12 October 2024 20:59:22 -0400 (0:00:00.116) 0:09:16.782 ****** TASK [fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69 Saturday 12 October 2024 20:59:22 -0400 (0:00:00.063) 0:09:16.846 ****** changed: [managed-node2] => { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "password": "-", "state": "absent" } ], "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", 
"/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=nXc7RT-HEZY-OYjB-gzgg-aPxl-xWkJ-gSSNJT", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [fedora.linux_system_roles.storage : Workaround for udev issue on some platforms] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:83 Saturday 12 October 2024 20:59:57 -0400 (0:00:34.950) 0:09:51.797 ****** skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" } TASK [fedora.linux_system_roles.storage : Check if /etc/fstab is present] ****** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:90 Saturday 12 October 2024 20:59:57 -0400 (0:00:00.070) 0:09:51.867 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728781133.8449647, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "faf50819b800ecfe22be8d19ce2b81e88bb7de9e", "ctime": 1728781133.8419647, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 263584, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0644", "mtime": 1728781133.8419647, "nlink": 1, "path": "/etc/fstab", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 1311, "uid": 0, "version": "1741548960", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Add fingerprint to /etc/fstab if present] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:95 Saturday 12 October 2024 20:59:58 -0400 (0:00:00.577) 0:09:52.445 ****** ok: [managed-node2] => { "backup": "", "changed": false } TASK [fedora.linux_system_roles.storage : Unmask the systemd cryptsetup services] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:113 Saturday 12 October 2024 20:59:58 -0400 (0:00:00.610) 0:09:53.056 ****** TASK [fedora.linux_system_roles.storage : Show blivet_output] ****************** task path: 
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:119 Saturday 12 October 2024 20:59:58 -0400 (0:00:00.082) 0:09:53.139 ****** ok: [managed-node2] => { "blivet_output": { "actions": [ { "action": "destroy format", "device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "fs_type": "xfs" }, { "action": "destroy device", "device": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "fs_type": null }, { "action": "destroy format", "device": "/dev/mapper/foo-test1", "fs_type": "luks" }, { "action": "destroy device", "device": "/dev/mapper/foo-test1", "fs_type": null }, { "action": "destroy device", "device": "/dev/foo", "fs_type": null }, { "action": "destroy format", "device": "/dev/sda", "fs_type": "lvmpv" } ], "changed": true, "crypts": [ { "backing_device": "/dev/mapper/foo-test1", "name": "luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "password": "-", "state": "absent" } ], "failed": false, "leaves": [ "/dev/sda", "/dev/sdb", "/dev/sdc", "/dev/sdd", "/dev/sde", "/dev/sdf", "/dev/sdg", "/dev/sdh", "/dev/sdi", "/dev/xvda1" ], "mounts": [ { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "state": "absent" } ], "packages": [ "e2fsprogs" ], "pools": [], "volumes": [ { "_device": "/dev/sda", "_mount_id": "UUID=nXc7RT-HEZY-OYjB-gzgg-aPxl-xWkJ-gSSNJT", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } } TASK [fedora.linux_system_roles.storage : Set the list of pools for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:128 Saturday 12 October 2024 20:59:58 -0400 (0:00:00.138) 0:09:53.278 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_pools_list": [] }, "changed": false } TASK [fedora.linux_system_roles.storage : Set the list of volumes for test verification] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:132 Saturday 12 October 2024 20:59:59 -0400 (0:00:00.114) 0:09:53.392 ****** ok: [managed-node2] => { "ansible_facts": { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=nXc7RT-HEZY-OYjB-gzgg-aPxl-xWkJ-gSSNJT", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", 
"mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }, "changed": false } TASK [fedora.linux_system_roles.storage : Remove obsolete mounts] ************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:148 Saturday 12 October 2024 20:59:59 -0400 (0:00:00.102) 0:09:53.495 ****** changed: [managed-node2] => (item={u'src': u'/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6', u'state': u'absent', u'fstype': u'xfs', u'path': u'/opt/test1'}) => { "ansible_loop_var": "mount_info", "changed": true, "dump": "0", "fstab": "/etc/fstab", "fstype": "xfs", "mount_info": { "fstype": "xfs", "path": "/opt/test1", "src": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "state": "absent" }, "name": "/opt/test1", "opts": "defaults", "passno": "0", "src": "/dev/mapper/luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6" } TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:159 Saturday 12 October 2024 20:59:59 -0400 (0:00:00.487) 0:09:53.982 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Set up new/current mounts] *********** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:164 Saturday 12 October 2024 21:00:00 -0400 (0:00:00.728) 0:09:54.710 ****** TASK [fedora.linux_system_roles.storage : Manage mount ownership/permissions] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:175 Saturday 12 October 2024 21:00:00 -0400 (0:00:00.054) 0:09:54.765 ****** TASK [fedora.linux_system_roles.storage : Tell systemd to refresh its view of /etc/fstab] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:187 Saturday 12 October 2024 21:00:00 -0400 (0:00:00.052) 0:09:54.817 ****** ok: [managed-node2] => { "changed": false, "name": null, "status": {} } TASK [fedora.linux_system_roles.storage : Retrieve facts for the /etc/crypttab file] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:195 Saturday 12 October 2024 21:00:01 -0400 (0:00:00.513) 0:09:55.331 ****** ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728781137.3479815, "attr_flags": "e", "attributes": [ "extents" ], "block_size": 4096, "blocks": 8, "charset": "us-ascii", "checksum": "58af6f536a2665a911ea5036b50f877287d2f399", "ctime": 1728781135.1709712, "dev": 51713, "device_type": 0, "executable": false, "exists": true, "gid": 0, "gr_name": "root", "inode": 917511, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": true, "issock": false, "isuid": false, "mimetype": "text/plain", "mode": "0600", "mtime": 1728781135.1699712, "nlink": 1, "path": "/etc/crypttab", "pw_name": "root", "readable": true, "rgrp": 
false, "roth": false, "rusr": true, "size": 66, "uid": 0, "version": "1741555431", "wgrp": false, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } } TASK [fedora.linux_system_roles.storage : Manage /etc/crypttab to account for changes we just made] *** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:200 Saturday 12 October 2024 21:00:01 -0400 (0:00:00.368) 0:09:55.699 ****** changed: [managed-node2] => (item={u'state': u'absent', u'password': u'-', u'name': u'luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6', u'backing_device': u'/dev/mapper/foo-test1'}) => { "ansible_loop_var": "entry", "backup": "", "changed": true, "entry": { "backing_device": "/dev/mapper/foo-test1", "name": "luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6", "password": "-", "state": "absent" }, "found": 1 } MSG: 1 line(s) removed TASK [fedora.linux_system_roles.storage : Update facts] ************************ task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222 Saturday 12 October 2024 21:00:01 -0400 (0:00:00.392) 0:09:56.092 ****** ok: [managed-node2] TASK [Verify role results] ***************************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:501 Saturday 12 October 2024 21:00:02 -0400 (0:00:00.731) 0:09:56.824 ****** included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2 TASK [Print out pool information] ********************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2 Saturday 12 October 2024 21:00:02 -0400 (0:00:00.133) 0:09:56.957 ****** skipping: [managed-node2] => {} TASK [Print out volume information] ******************************************** task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7 Saturday 12 October 2024 21:00:02 -0400 (0:00:00.051) 0:09:57.009 ****** ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=nXc7RT-HEZY-OYjB-gzgg-aPxl-xWkJ-gSSNJT", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] } TASK [Collect info about the volumes.] 
TASK [fedora.linux_system_roles.storage : Update facts] ************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:222
Saturday 12 October 2024 21:00:01 -0400 (0:00:00.392) 0:09:56.092 ******
ok: [managed-node2]

TASK [Verify role results] *****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/tests_luks.yml:501
Saturday 12 October 2024 21:00:02 -0400 (0:00:00.731) 0:09:56.824 ******
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml for managed-node2

TASK [Print out pool information] **********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:2
Saturday 12 October 2024 21:00:02 -0400 (0:00:00.133) 0:09:56.957 ******
skipping: [managed-node2] => {}

TASK [Print out volume information] ********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:7
Saturday 12 October 2024 21:00:02 -0400 (0:00:00.051) 0:09:57.009 ******
ok: [managed-node2] => { "_storage_volumes_list": [ { "_device": "/dev/sda", "_mount_id": "UUID=nXc7RT-HEZY-OYjB-gzgg-aPxl-xWkJ-gSSNJT", "_raw_device": "/dev/sda", "cache_devices": [], "cache_mode": null, "cache_size": 0, "cached": false, "compression": null, "deduplication": null, "disks": [ "sda" ], "encryption": false, "encryption_cipher": null, "encryption_key": null, "encryption_key_size": null, "encryption_luks_version": null, "encryption_password": null, "fs_create_options": "", "fs_label": "", "fs_overwrite_existing": true, "fs_type": "lvmpv", "mount_check": 0, "mount_device_identifier": "uuid", "mount_group": null, "mount_mode": null, "mount_options": "defaults", "mount_passno": 0, "mount_point": null, "mount_user": null, "name": "foo", "raid_chunk_size": null, "raid_device_count": null, "raid_level": null, "raid_metadata_version": null, "raid_spare_count": null, "raid_stripe_size": null, "size": 10737418240, "state": "absent", "thin": null, "thin_pool_name": null, "thin_pool_size": null, "type": "disk", "vdo_pool_size": null } ] }

TASK [Collect info about the volumes.] *****************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:15
Saturday 12 October 2024 21:00:02 -0400 (0:00:00.067) 0:09:57.077 ******
ok: [managed-node2] => { "changed": false, "info": { "/dev/sda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sda", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdb": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdb", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdc": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdc", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdd": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdd", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sde": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sde", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdf": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdf", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdg": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdg", "size": "1T", "type": "disk", "uuid": "" }, "/dev/sdh": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdh", "size": "10G", "type": "disk", "uuid": "" }, "/dev/sdi": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/sdi", "size": "10G", "type": "disk", "uuid": "" }, "/dev/xvda": { "fstype": "", "label": "", "mountpoint": "", "name": "/dev/xvda", "size": "250G", "type": "disk", "uuid": "" }, "/dev/xvda1": { "fstype": "ext4", "label": "", "mountpoint": "/", "name": "/dev/xvda1", "size": "250G", "type": "partition", "uuid": "c7b7d6a5-fd01-4b9b-bcca-153eaff9d312" } } }

TASK [Read the /etc/fstab file for volume existence] ***************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:20
Saturday 12 October 2024 21:00:03 -0400 (0:00:00.385) 0:09:57.462 ******
ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/fstab" ], "delta": "0:00:00.002485", "end": "2024-10-12 21:00:03.444271", "rc": 0, "start": "2024-10-12 21:00:03.441786" }
STDOUT:
# system_role:storage
#
# /etc/fstab
# Created by anaconda on Thu Jun 20 10:23:46 2024
#
# Accessible filesystems, by reference, are maintained under '/dev/disk'
# See man pages fstab(5), findfs(8), mount(8) and/or blkid(8) for more info
#
UUID=c7b7d6a5-fd01-4b9b-bcca-153eaff9d312 / ext4 defaults 1 1
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat /mnt/redhat nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/qa /mnt/qa nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
vtap-eng01.storage.rdu2.redhat.com:/vol/engarchive /mnt/engarchive nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
nest.test.redhat.com:/mnt/tpsdist /mnt/tpsdist nfs defaults,rsize=8192,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_engineering_nfs/devarchive/redhat/brewroot /mnt/brew nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
ntap-bos-c01-eng01-nfs01b.storage.bos.redhat.com:/devops_brew_scratch_nfs/scratch /mnt/brew_scratch nfs ro,rsize=32768,wsize=8192,bg,noauto,noatime,nosuid,nodev,intr 0 0
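Reading /etc/fstab back (note the role's own "# system_role:storage" marker at the top of the file) feeds the fstab assertions that follow: after a removal, the test expects zero lines referencing the volume. A standalone check in the same spirit, with the device name taken from the log and the task wording mine, might look like:

    - name: Read fstab
      command: cat /etc/fstab
      register: storage_test_fstab
      changed_when: false    # read-only, never report a change

    - name: Assert the removed volume left no fstab entry
      assert:
        that:
          - "'luks-d54b19e2-d700-4352-9ef0-ca3f826fceb6' not in storage_test_fstab.stdout"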
TASK [Read the /etc/crypttab file] *********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:25
Saturday 12 October 2024 21:00:03 -0400 (0:00:00.348) 0:09:57.811 ******
ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/etc/crypttab" ], "delta": "0:00:00.002490", "end": "2024-10-12 21:00:03.751436", "failed_when_result": false, "rc": 0, "start": "2024-10-12 21:00:03.748946" }

TASK [Verify the volumes listed in storage_pools were correctly managed] *******
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:34
Saturday 12 October 2024 21:00:03 -0400 (0:00:00.325) 0:09:58.136 ******

TASK [Verify the volumes with no pool were correctly managed] ******************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:44
Saturday 12 October 2024 21:00:03 -0400 (0:00:00.059) 0:09:58.196 ******
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml for managed-node2

TASK [Set storage volume test variables] ***************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:2
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.112) 0:09:58.308 ******
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": false, "_storage_volume_tests": [ "mount", "fstab", "fs", "device", "encryption", "md", "size", "cache" ] }, "changed": false }

TASK [Run test verify for {{ storage_test_volume_subset }}] ********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:19
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.066) 0:09:58.374 ******
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml for managed-node2
included: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml for managed-node2
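The eight "included:" lines above all come from a single task: test-verify-volume.yml loops the include over _storage_volume_tests, pulling in one test-verify-volume-<subset>.yml file per entry, with the loop variable exposed as storage_test_volume_subset (hence the unrendered {{ }} in the task name). Schematically, assuming it is a plain include_tasks loop:

    - name: Run test verify for {{ storage_test_volume_subset }}
      include_tasks: "test-verify-volume-{{ storage_test_volume_subset }}.yml"
      loop: "{{ _storage_volume_tests }}"   # mount, fstab, fs, device, ...
      loop_control:
        loop_var: storage_test_volume_subset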
TASK [Get expected mount device based on device type] **************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:7
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.248) 0:09:58.622 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_device_path": "/dev/sda" }, "changed": false }

TASK [Set some facts] **********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:11
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.059) 0:09:58.683 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_mount_expected_mount_point": "", "storage_test_swap_expected_matches": "0" }, "changed": false }

TASK [Get information about the mountpoint directory] **************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:19
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.066) 0:09:58.749 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the current mount state by device] ********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:28
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.054) 0:09:58.803 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory user] *********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:36
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.047) 0:09:58.851 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory group] ********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:42
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.056) 0:09:58.907 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify mount directory permissions] **************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:48
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.052) 0:09:58.959 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Get path of test volume device] ******************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:57
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.047) 0:09:59.007 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Gather swap info] ********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:63
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.052) 0:09:59.059 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify swap status] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:69
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.051) 0:09:59.111 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Unset facts] *************************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-mount.yml:79
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.052) 0:09:59.163 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_found_mount_stat": null, "storage_test_mount_expected_mount_point": null, "storage_test_swap_expected_matches": null, "storage_test_swaps": null, "storage_test_sys_node": null }, "changed": false }

TASK [Set some variables for fstab checking] ***********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:2
Saturday 12 October 2024 21:00:04 -0400 (0:00:00.056) 0:09:59.219 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": "0", "storage_test_fstab_expected_mount_options_matches": "0", "storage_test_fstab_expected_mount_point_matches": "0", "storage_test_fstab_id_matches": [], "storage_test_fstab_mount_options_matches": [], "storage_test_fstab_mount_point_matches": [] }, "changed": false }

TASK [Verify that the device identifier appears in /etc/fstab] *****************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:17
Saturday 12 October 2024 21:00:05 -0400 (0:00:00.183) 0:09:59.403 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the fstab mount point] ********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:24
Saturday 12 October 2024 21:00:05 -0400 (0:00:00.052) 0:09:59.456 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify mount_options] ****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:33
Saturday 12 October 2024 21:00:05 -0400 (0:00:00.062) 0:09:59.518 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fingerprint] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:45
Saturday 12 October 2024 21:00:05 -0400 (0:00:00.053) 0:09:59.571 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Clean up variables] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fstab.yml:51
Saturday 12 October 2024 21:00:05 -0400 (0:00:00.060) 0:09:59.632 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_fstab_expected_id_matches": null, "storage_test_fstab_expected_mount_options_matches": null, "storage_test_fstab_expected_mount_point_matches": null, "storage_test_fstab_id_matches": null, "storage_test_fstab_mount_options_matches": null, "storage_test_fstab_mount_point_matches": null }, "changed": false }

TASK [Verify fs type] **********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:6
Saturday 12 October 2024 21:00:05 -0400 (0:00:00.056) 0:09:59.688 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify fs label] *********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-fs.yml:14
Saturday 12 October 2024 21:00:05 -0400 (0:00:00.064) 0:09:59.752 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [See whether the device node is present] **********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:3
Saturday 12 October 2024 21:00:05 -0400 (0:00:00.066) 0:09:59.819 ******
ok: [managed-node2] => { "changed": false, "stat": { "atime": 1728781197.2862682, "attr_flags": "", "attributes": [], "block_size": 4096, "blocks": 0, "charset": "binary", "ctime": 1728781197.2862682, "dev": 5, "device_type": 2048, "executable": false, "exists": true, "gid": 6, "gr_name": "disk", "inode": 27625, "isblk": true, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": false, "isreg": false, "issock": false, "isuid": false, "mimetype": "inode/blockdevice", "mode": "0660", "mtime": 1728781197.2862682, "nlink": 1, "path": "/dev/sda", "pw_name": "root", "readable": true, "rgrp": true, "roth": false, "rusr": true, "size": 0, "uid": 0, "version": null, "wgrp": true, "woth": false, "writeable": true, "wusr": true, "xgrp": false, "xoth": false, "xusr": false } }

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:9
Saturday 12 October 2024 21:00:06 -0400 (0:00:00.492) 0:10:00.311 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Verify the presence/absence of the device node] **************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:16
Saturday 12 October 2024 21:00:06 -0400 (0:00:00.088) 0:10:00.399 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about this volume] *********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:23
Saturday 12 October 2024 21:00:06 -0400 (0:00:00.055) 0:10:00.455 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Process volume type (set initial value) (1/2)] ***************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:29
Saturday 12 October 2024 21:00:06 -0400 (0:00:00.103) 0:10:00.559 ******
ok: [managed-node2] => { "ansible_facts": { "st_volume_type": "disk" }, "changed": false }

TASK [Process volume type (get RAID value) (2/2)] ******************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:33
Saturday 12 October 2024 21:00:06 -0400 (0:00:00.070) 0:10:00.629 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the volume's device type] *****************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-device.yml:38
Saturday 12 October 2024 21:00:06 -0400 (0:00:00.076) 0:10:00.705 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
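The stat result above is what the device-node assertions consume: the test volume is absent, but /dev/sda is the raw disk itself, so the node is still expected to exist as a block device (exists true, isblk true). Reduced to a standalone pair of tasks, with the register name and assertion wording mine:

    - name: See whether the device node is present
      stat:
        path: /dev/sda
        follow: yes          # resolve a symlink to the real node
      register: storage_test_dev

    - name: Verify the presence/absence of the device node
      assert:
        that:
          - storage_test_dev.stat.exists
          - storage_test_dev.stat.isblk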
TASK [Stat the LUKS device, if encrypted] **************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:3
Saturday 12 October 2024 21:00:06 -0400 (0:00:00.049) 0:10:00.755 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Ensure cryptsetup is present] ********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:10
Saturday 12 October 2024 21:00:06 -0400 (0:00:00.056) 0:10:00.811 ******
ok: [managed-node2] => { "changed": false, "rc": 0, "results": [ "cryptsetup-2.0.3-6.el7.x86_64 providing cryptsetup is already installed" ] }
lsrpackages: cryptsetup

TASK [Collect LUKS info for this volume] ***************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:16
Saturday 12 October 2024 21:00:07 -0400 (0:00:00.860) 0:10:01.672 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the presence/absence of the LUKS device node] *********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:22
Saturday 12 October 2024 21:00:07 -0400 (0:00:00.057) 0:10:01.730 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify that the raw device is the same as the device if not encrypted] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:29
Saturday 12 October 2024 21:00:07 -0400 (0:00:00.051) 0:10:01.782 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Make sure we got info about the LUKS volume if encrypted] ****************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:40
Saturday 12 October 2024 21:00:07 -0400 (0:00:00.046) 0:10:01.828 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Verify the LUKS volume's device type if encrypted] ***********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:46
Saturday 12 October 2024 21:00:07 -0400 (0:00:00.081) 0:10:01.909 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS version] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:51
Saturday 12 October 2024 21:00:07 -0400 (0:00:00.088) 0:10:01.998 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check LUKS key size] *****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:63
Saturday 12 October 2024 21:00:07 -0400 (0:00:00.063) 0:10:02.062 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
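Everything LUKS-specific after the cryptsetup package check is skipped because this volume has encryption: false; on an encrypted run, the "Collect LUKS info" step would gather header data for the version/key-size/cipher assertions that follow. The log does not show the exact command the test uses, but a plausible sketch of that collection (storage_test_volume standing in for the volume dict under test) is:

    - name: Collect LUKS info for this volume (sketch)
      command: cryptsetup luksDump /dev/mapper/foo-test1
      register: luks_dump
      changed_when: false    # inspection only
      when: storage_test_volume.encryption | bool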
TASK [Check LUKS cipher] *******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:75
Saturday 12 October 2024 21:00:07 -0400 (0:00:00.049) 0:10:02.112 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set test variables] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:87
Saturday 12 October 2024 21:00:07 -0400 (0:00:00.069) 0:10:02.181 ******
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": [], "_storage_test_expected_crypttab_entries": "0", "_storage_test_expected_crypttab_key_file": "-" }, "changed": false }

TASK [Check for /etc/crypttab entry] *******************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:93
Saturday 12 October 2024 21:00:07 -0400 (0:00:00.075) 0:10:02.257 ******
ok: [managed-node2] => { "changed": false }
MSG: All assertions passed

TASK [Validate the format of the crypttab entry] *******************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:100
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.065) 0:10:02.323 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check backing device of crypttab entry] **********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:108
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.045) 0:10:02.368 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check key file of crypttab entry] ****************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:116
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.046) 0:10:02.414 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clear test variables] ****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-encryption.yml:124
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.059) 0:10:02.474 ******
ok: [managed-node2] => { "ansible_facts": { "_storage_test_crypttab_entries": null, "_storage_test_expected_crypttab_entries": null, "_storage_test_expected_crypttab_key_file": null }, "changed": false }
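The crypttab check compares the lines that mention this volume (_storage_test_crypttab_entries, an empty list here) against the expected count of "0" set just above, which passes because the entry was removed earlier in the run. As a bare assertion using the same fact names (the exact expression in the test file may differ):

    - name: Check for /etc/crypttab entry (sketch)
      assert:
        that:
          - _storage_test_crypttab_entries | length ==
            _storage_test_expected_crypttab_entries | int
        msg: unexpected number of crypttab entries for this volume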
TASK [Set spare devices regex] *************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:19
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.037) 0:10:02.623 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set md version regex] ****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:24
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.047) 0:10:02.671 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set chunk size regex] ****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:29
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.043) 0:10:02.714 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the chunk size] ****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:37
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.038) 0:10:02.753 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID active devices count] *****************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:46
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.034) 0:10:02.788 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID spare devices count] ******************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:54
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.033) 0:10:02.822 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID metadata version] *********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:62
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.033) 0:10:02.855 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check RAID chunk size] ***************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-md.yml:70
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.044) 0:10:02.900 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the actual size of the volume] *************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:3
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.049) 0:10:02.949 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested size of the volume] **********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:11
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.052) 0:10:03.001 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected size] **********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:20
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.048) 0:10:03.050 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:28
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.037) 0:10:03.088 ******
ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }

TASK [Get the size of parent/pool device] **************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:32
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.043) 0:10:03.131 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show test pool] **********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:46
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.046) 0:10:03.177 ******
skipping: [managed-node2] => {}

TASK [Show test blockinfo] *****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:50
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.037) 0:10:03.215 ******
skipping: [managed-node2] => {}

TASK [Show test pool size] *****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:54
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.032) 0:10:03.248 ******
skipping: [managed-node2] => {}

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:58
Saturday 12 October 2024 21:00:08 -0400 (0:00:00.031) 0:10:03.280 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default thin pool reserved space values] *********************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:67
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.032) 0:10:03.312 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default minimal thin pool reserved space size] ***************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:71
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.032) 0:10:03.345 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Default maximal thin pool reserved space size] ***************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:76
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.036) 0:10:03.381 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate maximum usable space in thin pool] *****************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:82
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.033) 0:10:03.415 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply upper size limit to max usable thin pool space] ********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:86
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.036) 0:10:03.452 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Apply lower size limit to max usable thin pool space] ********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:91
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.063) 0:10:03.515 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Convert maximum usable thin pool space from int to Size] *****************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:96
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.042) 0:10:03.558 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show max thin pool size] *************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:101
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.042) 0:10:03.601 ******
skipping: [managed-node2] => {}

TASK [Show volume thin pool size] **********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:105
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.049) 0:10:03.650 ******
skipping: [managed-node2] => {}

TASK [Show test volume size] ***************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:109
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.041) 0:10:03.691 ******
skipping: [managed-node2] => {}

TASK [Establish base value for expected thin pool size] ************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:113
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.034) 0:10:03.726 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected size based on pool size and percentage value] *****
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:120
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.034) 0:10:03.760 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Establish base value for expected thin pool volume size] *****************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:127
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.033) 0:10:03.793 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Calculate the expected thin pool volume size based on percentage value] ***
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:131
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.037) 0:10:03.831 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Replace expected volume size with calculated value] **********************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:137
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.046) 0:10:03.877 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Show actual size] ********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:143
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.052) 0:10:03.930 ******
ok: [managed-node2] => { "storage_test_actual_size": { "changed": false, "skip_reason": "Conditional result was False", "skipped": true } }

TASK [Show expected size] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:147
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.055) 0:10:03.985 ******
ok: [managed-node2] => { "storage_test_expected_size": "4294967296" }

TASK [Assert expected size is actual size] *************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-size.yml:151
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.045) 0:10:04.031 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }
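storage_test_expected_size still holds "4294967296" (4 GiB) from the earlier create pass, and storage_test_actual_size is just the skipped-task dict because nothing was measured for an absent volume, so the final size assertion is skipped. When the volume exists, the comparison presumably reduces to an equality along these lines (the .bytes attribute and the tolerance-free match are assumptions; human_to_bytes is a stock Ansible filter):

    - name: Assert expected size is actual size (sketch)
      assert:
        that:
          - storage_test_actual_size.bytes == storage_test_expected_size | int
        msg: >-
          expected {{ '4 GiB' | human_to_bytes }} bytes, got
          {{ storage_test_actual_size.bytes | default('unmeasured') }}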
TASK [Get information about the LV] ********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:5
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.041) 0:10:04.073 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV segment type] *****************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:13
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.046) 0:10:04.119 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check segment type] ******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:17
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.040) 0:10:04.160 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set LV cache size] *******************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:24
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.034) 0:10:04.194 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Parse the requested cache size] ******************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:31
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.036) 0:10:04.231 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Set expected cache size] *************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:37
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.033) 0:10:04.265 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Check cache size] ********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume-cache.yml:42
Saturday 12 October 2024 21:00:09 -0400 (0:00:00.034) 0:10:04.299 ******
skipping: [managed-node2] => { "changed": false, "skip_reason": "Conditional result was False" }

TASK [Clean up facts] **********************************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/test-verify-volume.yml:25
Saturday 12 October 2024 21:00:10 -0400 (0:00:00.032) 0:10:04.332 ******
ok: [managed-node2] => { "ansible_facts": { "_storage_test_volume_present": null }, "changed": false }

TASK [Clean up variable namespace] *********************************************
task path: /tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/tests/storage/verify-role-results.yml:54
Saturday 12 October 2024 21:00:10 -0400 (0:00:00.035) 0:10:04.367 ******
ok: [managed-node2] => { "ansible_facts": { "storage_test_blkinfo": null, "storage_test_crypttab": null, "storage_test_fstab": null }, "changed": false }

META: ran handlers
META: ran handlers

PLAY RECAP *********************************************************************
managed-node2 : ok=1224 changed=60 unreachable=0 failed=9 skipped=1064 rescued=9 ignored=0

Saturday 12 October 2024 21:00:10 -0400 (0:00:00.019) 0:10:04.387 ******
===============================================================================
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 65.02s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 34.95s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.82s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.74s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.63s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.56s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state -- 10.04s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 9.96s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Make sure blivet is available ------- 6.33s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:2
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.71s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.46s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Get required packages --------------- 4.22s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.21s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Get required packages --------------- 4.19s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.19s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.18s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.14s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Manage the pools and volumes to match the specified state --- 4.12s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:69
fedora.linux_system_roles.storage : Get required packages --------------- 4.10s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
fedora.linux_system_roles.storage : Get required packages --------------- 4.09s
/tmp/collections-z2v/ansible_collections/fedora/linux_system_roles/roles/storage/tasks/main-blivet.yml:19
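In the recap, failed=9 together with rescued=9 on a run that still completed is the signature of negative tests: each intentionally invalid configuration is applied inside a block whose rescue section verifies the failure, so the play keeps going. The pattern, sketched below with placeholder vars and assertion rather than the test's actual cases:

    - name: Apply a configuration that must fail (pattern sketch)
      block:
        - name: Try an invalid storage configuration
          include_role:
            name: fedora.linux_system_roles.storage
          vars:
            storage_volumes:
              - name: test1
                disks: [sda]
                fs_type: unsupported_fs   # deliberately invalid
        - name: Unreachable if the role failed as expected
          fail:
            msg: the role was expected to fail but did not
      rescue:
        - name: Verify the failure was the expected one
          assert:
            that: ansible_failed_result is defined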